Apr 16 18:03:24.945476 ip-10-0-139-88 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:03:25.381233 ip-10-0-139-88 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:03:25.381233 ip-10-0-139-88 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:03:25.381233 ip-10-0-139-88 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:03:25.381233 ip-10-0-139-88 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:03:25.381897 ip-10-0-139-88 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:03:25.382270 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.382135 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:03:25.387344 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387329 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:03:25.387344 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387344 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387349 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387352 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387356 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387359 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387362 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387365 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387368 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387370 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387373 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387376 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387378 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387381 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387384 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387386 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387389 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387392 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387394 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387397 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:03:25.387409 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387400 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387404 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387408 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387414 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387418 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387421 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387424 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387426 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387429 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387432 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387434 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387437 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387439 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387441 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387444 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387446 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387449 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387452 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387454 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:03:25.387858 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387457 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387459 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387461 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387464 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387467 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387470 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387472 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387475 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387477 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387480 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387482 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387485 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387487 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387490 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387493 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387496 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387499 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387502 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387504 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387507 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:03:25.388431 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387509 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387512 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387515 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387517 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387520 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387522 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387526 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387528 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387531 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387534 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387537 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387540 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387544 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387547 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387550 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387553 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387555 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387559 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387562 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387564 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:03:25.388919 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387567 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387569 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387572 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387574 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387577 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387580 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387582 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387977 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387982 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387985 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387988 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387991 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387994 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387997 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.387999 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388002 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388005 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388007 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388010 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388013 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:03:25.389456 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388015 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388018 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388021 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388023 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388026 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388029 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388032 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388034 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388038 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388041 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388044 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388046 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388049 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388051 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388054 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388057 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388059 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388062 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388064 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388068 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:03:25.389946 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388070 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388073 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388076 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388078 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388080 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388083 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388086 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388088 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388090 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388093 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388096 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388098 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388100 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388103 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388106 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388109 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388113 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388117 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388120 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:03:25.390459 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388122 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388125 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388128 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388131 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388133 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388136 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388139 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388141 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388144 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388146 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388150 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388154 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388157 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388160 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388163 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388166 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388168 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388171 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388174 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388177 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:03:25.390929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388179 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388182 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388184 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388201 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388204 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388207 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388211 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388231 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388235 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388238 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388241 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388244 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388247 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388250 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388318 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388326 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388341 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388346 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388351 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388354 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388359 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:03:25.391444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388363 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388366 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388369 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388373 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388376 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388380 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388383 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388386 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388389 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388392 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388394 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388398 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388403 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388406 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388409 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388412 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388415 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388419 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388423 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388427 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388430 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388434 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388437 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388440 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388443 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:03:25.391967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388447 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388451 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388454 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388457 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388460 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388464 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388467 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388472 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388475 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388478 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388482 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388485 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388489 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388492 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388495 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388498 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388501 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388507 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388522 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388526 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388529 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388533 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388536 2578 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388540 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388543 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:03:25.392600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388547 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:03:25.393232
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388550 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388553 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388556 2578 flags.go:64] FLAG: --help="false" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388559 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388562 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388565 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388568 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388572 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388575 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388578 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388582 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388584 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388587 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388591 2578 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388594 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388597 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388600 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388603 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388606 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388609 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388611 2578 flags.go:64] FLAG: --lock-file="" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388614 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388617 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:03:25.393232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388622 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388627 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388630 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388633 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388635 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:03:25.388639 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388642 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388645 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388648 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388652 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388655 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388659 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388662 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388665 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388668 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388671 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388674 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388677 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388680 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388689 2578 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388692 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388695 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388698 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:03:25.393810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388701 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388707 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388709 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388713 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388716 2578 flags.go:64] FLAG: --port="10250" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388719 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388722 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09f477678714bed03" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388725 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388728 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388732 2578 flags.go:64] FLAG: --register-node="true" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388735 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 
18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388738 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388742 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388745 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388748 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388750 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388754 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388757 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388760 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388763 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388766 2578 flags.go:64] FLAG: --runonce="false" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388769 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388772 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388775 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388778 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388780 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 16 18:03:25.394451 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388784 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388787 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388790 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388794 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388797 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388800 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388803 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388806 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388809 2578 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388812 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388818 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388821 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388824 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388828 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388831 2578 
flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388836 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388839 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388842 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388845 2578 flags.go:64] FLAG: --v="2" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388850 2578 flags.go:64] FLAG: --version="false" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388853 2578 flags.go:64] FLAG: --vmodule="" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388858 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.388861 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388959 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:03:25.395078 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388963 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388966 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388970 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388973 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388975 
2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388978 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388981 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388984 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388986 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388989 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388991 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388994 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.388997 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389003 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389005 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389008 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389011 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:03:25.395669 ip-10-0-139-88 
kubenswrapper[2578]: W0416 18:03:25.389013 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389016 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389019 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:03:25.395669 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389021 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389024 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389026 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389030 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389033 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389035 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389038 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389040 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389043 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389045 2578 feature_gate.go:328] unrecognized feature gate: 
UpgradeStatus Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389048 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389051 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389054 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389057 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389059 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389062 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389064 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389067 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389069 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389072 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:03:25.396163 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389074 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389077 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389079 2578 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389082 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389084 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389089 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389092 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389095 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389097 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389100 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389102 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389105 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389107 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389110 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389112 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389116 2578 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389119 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389121 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389124 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389126 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:03:25.396667 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389129 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389132 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389134 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389137 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389140 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389142 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389145 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389147 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:03:25.397174 ip-10-0-139-88 
kubenswrapper[2578]: W0416 18:03:25.389150 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389152 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389155 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389157 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389160 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389163 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389165 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389168 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389170 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389174 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389177 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:03:25.397174 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389180 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389182 2578 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389186 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389203 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389207 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.389211 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.389220 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.395391 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.395406 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395467 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395472 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 
18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395476 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395479 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395482 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395484 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:03:25.397658 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395487 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395490 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395493 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395496 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395499 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395502 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395505 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395507 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 
18:03:25.395510 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395513 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395516 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395518 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395522 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395524 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395527 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395530 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395533 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395535 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395538 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395540 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:03:25.398074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395543 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 
18:03:25.395545 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395548 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395551 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395553 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395557 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395560 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395563 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395565 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395568 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395571 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395573 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395576 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395578 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:03:25.398588 
ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395581 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395584 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395587 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395589 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395592 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395594 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:03:25.398588 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395597 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395599 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395602 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395605 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395609 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395613 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395616 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395618 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395621 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395623 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395626 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395628 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395631 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395634 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395636 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395639 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395642 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395646 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix 
Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395652 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395654 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:03:25.399074 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395657 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395660 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395663 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395665 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395668 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395671 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395673 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395676 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395679 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395682 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395685 2578 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395688 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395691 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395693 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395696 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395700 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395704 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395707 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395710 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:03:25.399607 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395713 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.395718 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395816 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395822 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395825 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395828 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395833 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395836 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395838 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395841 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395844 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395848 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395851 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:03:25.400091 ip-10-0-139-88 
kubenswrapper[2578]: W0416 18:03:25.395854 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395856 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:03:25.400091 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395859 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395861 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395864 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395866 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395869 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395871 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395874 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395876 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395879 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395882 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395885 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 
18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395887 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395890 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395892 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395895 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395898 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395900 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395903 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395906 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395908 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:03:25.400478 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395911 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395914 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395916 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395919 2578 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395921 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395924 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395926 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395929 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395932 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395934 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395938 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395941 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395944 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395946 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395948 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395951 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:03:25.401013 ip-10-0-139-88 
kubenswrapper[2578]: W0416 18:03:25.395954 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395956 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395959 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395961 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:03:25.401013 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395964 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395966 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395969 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395971 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395973 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395977 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395980 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395984 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395987 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395989 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395992 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395995 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.395997 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396000 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396003 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396005 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396008 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396010 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396013 2578 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDC Apr 16 18:03:25.401529 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396016 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396018 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396021 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396023 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396026 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396029 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396032 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396034 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396037 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396039 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396042 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396044 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396047 2578 feature_gate.go:328] 
unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:25.396050 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.396055 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 18:03:25.402018 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.396688 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 18:03:25.402415 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.399642 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 18:03:25.402415 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.400618 2578 server.go:1019] "Starting client certificate rotation" Apr 16 18:03:25.402415 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.400704 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:03:25.402415 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.400740 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 18:03:25.423832 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.423815 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:03:25.427399 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.427372 2578 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 18:03:25.445487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.445467 2578 log.go:25] "Validated CRI v1 runtime API" Apr 16 18:03:25.450433 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.450415 2578 log.go:25] "Validated CRI v1 image API" Apr 16 18:03:25.452880 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.452863 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 18:03:25.453280 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.453264 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:03:25.457224 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.457178 2578 fs.go:135] Filesystem UUIDs: map[293d2cdd-821e-40a6-a650-23559cf93490:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8b294ccb-e498-4b9e-9782-3ab2ef645d3d:/dev/nvme0n1p4] Apr 16 18:03:25.457299 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.457223 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 18:03:25.462773 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.462650 2578 manager.go:217] Machine: {Timestamp:2026-04-16 18:03:25.460862911 +0000 UTC m=+0.404475958 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2500004 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 
AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23fb4ac8a564f9b6c2b22c5e303878 SystemUUID:ec23fb4a-c8a5-64f9-b6c2-b22c5e303878 BootID:608af499-f2d0-409c-9904-296c7769a23e Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:73:42:58:8f:6d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:73:42:58:8f:6d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:ca:5c:90:17:db Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction 
Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 18:03:25.462773 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.462763 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 18:03:25.462936 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.462879 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 18:03:25.463957 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.463927 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 18:03:25.464130 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.463958 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-139-88.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:03:25.464240 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.464144 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:03:25.464240 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.464157 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:03:25.464240 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.464175 2578 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:03:25.464956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.464944 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:03:25.466729 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.466716 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:03:25.466849 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.466838 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:03:25.469152 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.469140 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:03:25.470337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.470325 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:03:25.470400 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.470353 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:03:25.470400 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.470368 2578 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:03:25.470400 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.470390 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:03:25.471470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.471456 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:03:25.471550 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.471480 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:03:25.474298 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.474282 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:03:25.477727 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:03:25.477705 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:03:25.478411 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.478384 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8rghs" Apr 16 18:03:25.479019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.478998 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479025 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479042 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479050 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479059 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479067 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479077 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479085 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479095 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:03:25.479108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479104 2578 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 16 18:03:25.479474 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479169 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:03:25.479474 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.479186 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:03:25.480206 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.480173 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:03:25.480270 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.480217 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:03:25.480449 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.480427 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:03:25.480509 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.480454 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-88.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:03:25.483183 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.483164 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-88.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:03:25.483947 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.483933 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:03:25.483984 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:03:25.483978 2578 server.go:1295] "Started kubelet" Apr 16 18:03:25.484106 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.484075 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:03:25.484182 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.484071 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:03:25.484235 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.484209 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:03:25.484917 ip-10-0-139-88 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:03:25.485837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.485810 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:03:25.486351 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.486334 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8rghs" Apr 16 18:03:25.486865 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.486850 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:03:25.491816 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.491796 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:03:25.491908 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.491814 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:03:25.492532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.492515 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:03:25.492532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.492516 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:03:25.492664 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.492546 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" 
Apr 16 18:03:25.492664 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.492616 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:03:25.492664 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.492627 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:03:25.493253 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.493233 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:25.494001 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.493985 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:03:25.494233 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.494183 2578 factory.go:55] Registering systemd factory Apr 16 18:03:25.494233 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.494217 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:03:25.494786 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.494760 2578 factory.go:153] Registering CRI-O factory Apr 16 18:03:25.494786 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.494782 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 18:03:25.494907 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.494813 2578 factory.go:103] Registering Raw factory Apr 16 18:03:25.494907 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.494827 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 18:03:25.495253 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.495239 2578 manager.go:319] Starting recovery of all containers Apr 16 18:03:25.495758 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.495719 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:03:25.496688 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.496666 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:03:25.500848 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.500827 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-88.ec2.internal\" not found" node="ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.504713 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.504698 2578 manager.go:324] Recovery completed Apr 16 18:03:25.509967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.509954 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:03:25.512628 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.512608 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:03:25.512689 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.512648 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:03:25.512689 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.512665 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:03:25.513167 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.513153 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:03:25.513167 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.513166 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:03:25.513308 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.513199 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:03:25.516306 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:03:25.516292 2578 policy_none.go:49] "None policy: Start" Apr 16 18:03:25.516381 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.516311 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:03:25.516381 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.516324 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:03:25.554805 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.554789 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.554822 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.554833 2578 server.go:85] "Starting device plugin registration server" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.555087 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.555101 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.555181 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.555273 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.555284 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.555719 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 18:03:25.565654 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.555753 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:25.642003 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.641924 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:03:25.643142 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.643109 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:03:25.643142 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.643138 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:03:25.643315 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.643161 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:03:25.643315 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.643170 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:03:25.643315 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.643285 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:03:25.646607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.646578 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:03:25.655968 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.655954 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:03:25.656744 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.656731 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:03:25.656797 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.656760 2578 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:03:25.656797 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.656770 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:03:25.656797 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.656793 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.665838 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.665815 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.665892 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.665847 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-88.ec2.internal\": node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:25.683828 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.683803 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:25.744126 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.744078 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal"] Apr 16 18:03:25.744251 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.744185 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:03:25.746449 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.746433 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:03:25.746533 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.746468 2578 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-139-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:03:25.746533 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.746482 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:03:25.748062 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748046 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:03:25.748224 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.748272 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748241 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:03:25.748829 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748816 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:03:25.748829 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748823 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:03:25.748947 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748842 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:03:25.748947 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748848 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:03:25.748947 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748857 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:03:25.748947 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.748863 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:03:25.750447 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.750432 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.750511 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.750458 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:03:25.751226 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.751212 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:03:25.751292 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.751239 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:03:25.751292 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.751249 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:03:25.775967 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.775947 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-88.ec2.internal\" not found" node="ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.780391 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.780376 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-88.ec2.internal\" not found" node="ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.784422 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.784407 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 
16 18:03:25.884655 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.884617 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:25.893959 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.893905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f308e559b92afbe7179caa99acdc8f6c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal\" (UID: \"f308e559b92afbe7179caa99acdc8f6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.893959 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.893933 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f308e559b92afbe7179caa99acdc8f6c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal\" (UID: \"f308e559b92afbe7179caa99acdc8f6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.893959 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.893953 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a22c454fc2ba5924d3a6892717c717ec-config\") pod \"kube-apiserver-proxy-ip-10-0-139-88.ec2.internal\" (UID: \"a22c454fc2ba5924d3a6892717c717ec\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.985596 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:25.985542 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:25.994906 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.994883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/f308e559b92afbe7179caa99acdc8f6c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal\" (UID: \"f308e559b92afbe7179caa99acdc8f6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.994958 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.994917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f308e559b92afbe7179caa99acdc8f6c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal\" (UID: \"f308e559b92afbe7179caa99acdc8f6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.994958 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.994934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a22c454fc2ba5924d3a6892717c717ec-config\") pod \"kube-apiserver-proxy-ip-10-0-139-88.ec2.internal\" (UID: \"a22c454fc2ba5924d3a6892717c717ec\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.995020 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.994961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f308e559b92afbe7179caa99acdc8f6c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal\" (UID: \"f308e559b92afbe7179caa99acdc8f6c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.995020 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.994966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f308e559b92afbe7179caa99acdc8f6c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal\" (UID: \"f308e559b92afbe7179caa99acdc8f6c\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:25.995020 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:25.994996 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a22c454fc2ba5924d3a6892717c717ec-config\") pod \"kube-apiserver-proxy-ip-10-0-139-88.ec2.internal\" (UID: \"a22c454fc2ba5924d3a6892717c717ec\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" Apr 16 18:03:26.079092 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.079059 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" Apr 16 18:03:26.082691 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.082678 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" Apr 16 18:03:26.085695 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.085674 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:26.186249 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.186164 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:26.286715 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.286691 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:26.387312 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.387275 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found" Apr 16 18:03:26.400804 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.400781 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to 
start using new credentials"
Apr 16 18:03:26.400954 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.400939 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:03:26.400995 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.400963 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:03:26.488361 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.488283 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found"
Apr 16 18:03:26.488361 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.488294 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:58:25 +0000 UTC" deadline="2027-11-11 15:09:51.157661179 +0000 UTC"
Apr 16 18:03:26.488361 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.488323 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13773h6m24.669342034s"
Apr 16 18:03:26.491903 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.491883 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:03:26.500959 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.500939 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:03:26.529841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.529809 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-52thm"
Apr 16 18:03:26.541580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.541558 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-52thm"
Apr 16 18:03:26.588539 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.588506 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found"
Apr 16 18:03:26.655978 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.655821 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:03:26.689600 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:26.689574 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-88.ec2.internal\" not found"
Apr 16 18:03:26.699934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.699912 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:03:26.734115 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:26.734086 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22c454fc2ba5924d3a6892717c717ec.slice/crio-4558fbdee1e63f242053cca38d827376765e3065a8494b1f8a6087c07d93e462 WatchSource:0}: Error finding container 4558fbdee1e63f242053cca38d827376765e3065a8494b1f8a6087c07d93e462: Status 404 returned error can't find the container with id 4558fbdee1e63f242053cca38d827376765e3065a8494b1f8a6087c07d93e462
Apr 16 18:03:26.734434 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:26.734412 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf308e559b92afbe7179caa99acdc8f6c.slice/crio-73c317320b9c29243bf1850cfc6ccb5276ffa5ebb58f5cca7a88d188199dd688 WatchSource:0}: Error finding container 73c317320b9c29243bf1850cfc6ccb5276ffa5ebb58f5cca7a88d188199dd688: Status 404 returned error can't find the container with id 73c317320b9c29243bf1850cfc6ccb5276ffa5ebb58f5cca7a88d188199dd688
Apr 16 18:03:26.738305 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.738291 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:03:26.792962 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.792938 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal"
Apr 16 18:03:26.808023 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.808006 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:03:26.809268 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.809255 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal"
Apr 16 18:03:26.818112 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:26.818096 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:03:27.364675 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.364505 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:03:27.472223 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.472178 2578 apiserver.go:52] "Watching apiserver"
Apr 16 18:03:27.477334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.477306 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:03:27.478131 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.477849 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-t49dd","openshift-network-operator/iptables-alerter-2x6mt","kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2","openshift-cluster-node-tuning-operator/tuned-qdrmx","openshift-dns/node-resolver-hxbd9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal","openshift-network-diagnostics/network-check-target-9qqgk","openshift-ovn-kubernetes/ovnkube-node-2pmhr","kube-system/konnectivity-agent-hhkz9","openshift-image-registry/node-ca-g8588","openshift-multus/multus-additional-cni-plugins-vbbl2","openshift-multus/multus-rx2m9"]
Apr 16 18:03:27.479425 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.479404 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.480628 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.480611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.481465 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.481363 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:03:27.481465 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.481401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:03:27.481465 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.481419 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pmbd2\""
Apr 16 18:03:27.482749 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.482294 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.482749 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.482710 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.482749 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.482736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.482954 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.482811 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:03:27.482954 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.482738 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kslbq\""
Apr 16 18:03:27.483444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.483418 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.483987 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.483966 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.484274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.484149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.484274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.484159 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:03:27.484274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.484219 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nf4sf\""
Apr 16 18:03:27.484631 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.484608 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:27.484720 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.484691 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:27.485346 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.485181 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.485346 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.485319 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6g7d8\""
Apr 16 18:03:27.485522 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.485361 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.486061 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.486042 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.487486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.487466 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:27.487600 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.487543 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:27.487755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.487599 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kfgnr\""
Apr 16 18:03:27.487916 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.487897 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.488016 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.487917 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:03:27.488016 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.487937 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:03:27.488016 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.487918 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:03:27.488167 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.488018 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:03:27.488242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.488234 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.488969 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.488938 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.490281 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.490262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.491334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.491027 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:03:27.491334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.491053 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-h2kvz\""
Apr 16 18:03:27.491334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.491085 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.491334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.491307 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.493141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.492518 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.493141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.492673 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:03:27.494228 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.494185 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.494515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.494497 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mkd72\""
Apr 16 18:03:27.494606 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.494507 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:03:27.494839 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.494819 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.494975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.494956 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nh8xf\""
Apr 16 18:03:27.495300 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.495282 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:03:27.495382 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.495342 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:03:27.497443 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.497411 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.499411 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.499389 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:03:27.499505 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.499445 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:03:27.499597 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.499562 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-269wx\""
Apr 16 18:03:27.502956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.502938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-multus-certs\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.503047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.502968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-modprobe-d\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.503047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503019 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-slash\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.503047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-etc-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.503175 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.503175 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-env-overrides\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.503175 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-daemon-config\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.503175 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-etc-kubernetes\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.503175 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-socket-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-device-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-sys-fs\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503302 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-systemd-units\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-host\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2811e81e-8a8b-4ab5-903d-22cce72663e2-hosts-file\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-socket-dir-parent\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-host\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s4jt\" (UniqueName: \"kubernetes.io/projected/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-kube-api-access-6s4jt\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.503580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71ef152c-1129-42dc-8a47-99e2ea30df5b-konnectivity-ca\") pod \"konnectivity-agent-hhkz9\" (UID: \"71ef152c-1129-42dc-8a47-99e2ea30df5b\") " pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-cni-multus\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-host-slash\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlpv7\" (UniqueName: \"kubernetes.io/projected/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-kube-api-access-tlpv7\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-k8s-cni-cncf-io\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysconfig\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-var-lib-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-log-socket\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-kubernetes\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysctl-d\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503840 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-var-lib-kubelet\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503868 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/227f1e14-5d18-4377-b065-301e80c9a0f5-tmp\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-registration-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-systemd\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-ovn\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-system-cni-dir\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.504077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.503994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-os-release\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-kubelet\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-serviceca\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504075 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tw5k\" (UniqueName: \"kubernetes.io/projected/2811e81e-8a8b-4ab5-903d-22cce72663e2-kube-api-access-2tw5k\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-cni-bin\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-conf-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504209 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-etc-selinux\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504239 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-ovnkube-config\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2bns\" (UniqueName: \"kubernetes.io/projected/6b4ee727-773d-47fd-8e52-976f88918e9d-kube-api-access-g2bns\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-run\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-cni-netd\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b4ee727-773d-47fd-8e52-976f88918e9d-ovn-node-metrics-cert\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504412 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2811e81e-8a8b-4ab5-903d-22cce72663e2-tmp-dir\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-cnibin\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-systemd\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-lib-modules\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.504763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-iptables-alerter-script\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-run-netns\") pod
\"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cnibin\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504638 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71ef152c-1129-42dc-8a47-99e2ea30df5b-agent-certs\") pod \"konnectivity-agent-hhkz9\" (UID: \"71ef152c-1129-42dc-8a47-99e2ea30df5b\") " pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysctl-conf\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbm7\" (UniqueName: \"kubernetes.io/projected/e79e1f28-6320-4694-b831-a2bd45771d4a-kube-api-access-8fbm7\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-cni-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2119e250-ea67-47ef-ab06-f2ae21b8044f-cni-binary-copy\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvx9\" (UniqueName: \"kubernetes.io/projected/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-kube-api-access-shvx9\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-cni-bin\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cni-binary-copy\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.504982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-os-release\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btz89\" (UniqueName: \"kubernetes.io/projected/2119e250-ea67-47ef-ab06-f2ae21b8044f-kube-api-access-btz89\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.505593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-sys\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-tuned\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505082 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-system-cni-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-netns\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505167 2578 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgdj\" (UniqueName: \"kubernetes.io/projected/227f1e14-5d18-4377-b065-301e80c9a0f5-kube-api-access-ddgdj\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-node-log\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-ovnkube-script-lib\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505262 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505288 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97tpb\" (UniqueName: \"kubernetes.io/projected/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-kube-api-access-97tpb\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-kubelet\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.506337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.505336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-hostroot\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.544955 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.544892 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:58:26 +0000 UTC" deadline="2027-10-05 17:23:24.054299098 +0000 UTC"
Apr 16 18:03:27.544955 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.544923 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12887h19m56.509379097s"
Apr 16 18:03:27.594032 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.594004 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:03:27.606205 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-host\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.606357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s4jt\" (UniqueName: \"kubernetes.io/projected/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-kube-api-access-6s4jt\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.606357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.606357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-host\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71ef152c-1129-42dc-8a47-99e2ea30df5b-konnectivity-ca\") pod \"konnectivity-agent-hhkz9\" (UID: \"71ef152c-1129-42dc-8a47-99e2ea30df5b\") " pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.606357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-cni-multus\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-host-slash\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-cni-multus\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlpv7\" (UniqueName: \"kubernetes.io/projected/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-kube-api-access-tlpv7\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") "
pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-k8s-cni-cncf-io\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-host-slash\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.606654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysconfig\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-var-lib-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-k8s-cni-cncf-io\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-log-socket\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-kubernetes\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysconfig\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysctl-d\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606774 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-var-lib-kubelet\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-log-socket\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606823 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-var-lib-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/227f1e14-5d18-4377-b065-301e80c9a0f5-tmp\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-registration-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.606905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.606883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-systemd\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysctl-d\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71ef152c-1129-42dc-8a47-99e2ea30df5b-konnectivity-ca\") pod \"konnectivity-agent-hhkz9\" (UID: \"71ef152c-1129-42dc-8a47-99e2ea30df5b\") " pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-kubernetes\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-systemd\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-registration-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID:
\"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607403 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607426 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-ovn\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-run-ovn\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607481 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-system-cni-dir\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-os-release\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-var-lib-kubelet\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-kubelet\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-serviceca\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607586 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-system-cni-dir\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tw5k\" (UniqueName: \"kubernetes.io/projected/2811e81e-8a8b-4ab5-903d-22cce72663e2-kube-api-access-2tw5k\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607628 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-cni-bin\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-conf-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-etc-selinux\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.607691 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-ovnkube-config\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2bns\" (UniqueName: \"kubernetes.io/projected/6b4ee727-773d-47fd-8e52-976f88918e9d-kube-api-access-g2bns\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607729 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-os-release\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-run\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-cni-bin\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.607873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-run\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.607810 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.107782922 +0000 UTC m=+3.051395977 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607832 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-kubelet\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-cni-netd\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/6b4ee727-773d-47fd-8e52-976f88918e9d-ovn-node-metrics-cert\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-etc-selinux\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2811e81e-8a8b-4ab5-903d-22cce72663e2-tmp-dir\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-cnibin\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-systemd\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-lib-modules\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-iptables-alerter-script\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-cnibin\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.607835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-conf-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-run-netns\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-cni-netd\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.608663 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608448 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-ovnkube-config\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608472 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-run-netns\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-lib-modules\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2811e81e-8a8b-4ab5-903d-22cce72663e2-tmp-dir\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-systemd\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cnibin\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.608989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71ef152c-1129-42dc-8a47-99e2ea30df5b-agent-certs\") pod \"konnectivity-agent-hhkz9\" (UID: \"71ef152c-1129-42dc-8a47-99e2ea30df5b\") " pod="kube-system/konnectivity-agent-hhkz9" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysctl-conf\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cnibin\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbm7\" (UniqueName: \"kubernetes.io/projected/e79e1f28-6320-4694-b831-a2bd45771d4a-kube-api-access-8fbm7\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609083 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:03:27.609101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-sysctl-conf\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-iptables-alerter-script\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.609453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-cni-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " 
pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2119e250-ea67-47ef-ab06-f2ae21b8044f-cni-binary-copy\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shvx9\" (UniqueName: \"kubernetes.io/projected/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-kube-api-access-shvx9\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-cni-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-cni-bin\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cni-binary-copy\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " 
pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-os-release\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609635 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btz89\" (UniqueName: \"kubernetes.io/projected/2119e250-ea67-47ef-ab06-f2ae21b8044f-kube-api-access-btz89\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-sys\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609708 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-tuned\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-os-release\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-system-cni-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-system-cni-dir\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.609954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-cni-bin\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-serviceca\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-netns\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgdj\" (UniqueName: \"kubernetes.io/projected/227f1e14-5d18-4377-b065-301e80c9a0f5-kube-api-access-ddgdj\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-node-log\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-ovnkube-script-lib\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97tpb\" (UniqueName: \"kubernetes.io/projected/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-kube-api-access-97tpb\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-kubelet\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-hostroot\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-multus-certs\") pod 
\"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-modprobe-d\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2119e250-ea67-47ef-ab06-f2ae21b8044f-cni-binary-copy\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-slash\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-etc-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2pmhr\" 
(UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-cni-binary-copy\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-env-overrides\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.610952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610688 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-daemon-config\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-etc-kubernetes\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-socket-dir\") pod 
\"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-device-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-sys-fs\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-systemd-units\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610869 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-host\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588" Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610872 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-sys\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2811e81e-8a8b-4ab5-903d-22cce72663e2-hosts-file\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-socket-dir-parent\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2811e81e-8a8b-4ab5-903d-22cce72663e2-hosts-file\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.610993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611037 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-etc-openvswitch\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-var-lib-kubelet\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611069 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-slash\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-socket-dir-parent\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-node-log\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611212 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-host-run-ovn-kubernetes\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.611723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/227f1e14-5d18-4377-b065-301e80c9a0f5-tmp\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-socket-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611480 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-multus-certs\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-modprobe-d\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-etc-kubernetes\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611731 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-hostroot\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-env-overrides\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6b4ee727-773d-47fd-8e52-976f88918e9d-systemd-units\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611901 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-host\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611920 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-device-dir\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611942 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2119e250-ea67-47ef-ab06-f2ae21b8044f-host-run-netns\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.611934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e79e1f28-6320-4694-b831-a2bd45771d4a-sys-fs\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.612034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2119e250-ea67-47ef-ab06-f2ae21b8044f-multus-daemon-config\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.612494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.612125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6b4ee727-773d-47fd-8e52-976f88918e9d-ovnkube-script-lib\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.613377 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.613323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6b4ee727-773d-47fd-8e52-976f88918e9d-ovn-node-metrics-cert\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.613493 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.613474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/227f1e14-5d18-4377-b065-301e80c9a0f5-etc-tuned\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.613953 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.613931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71ef152c-1129-42dc-8a47-99e2ea30df5b-agent-certs\") pod \"konnectivity-agent-hhkz9\" (UID: \"71ef152c-1129-42dc-8a47-99e2ea30df5b\") " pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.616530 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.616475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlpv7\" (UniqueName: \"kubernetes.io/projected/2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77-kube-api-access-tlpv7\") pod \"multus-additional-cni-plugins-vbbl2\" (UID: \"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77\") " pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.619219 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.618437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgdj\" (UniqueName: \"kubernetes.io/projected/227f1e14-5d18-4377-b065-301e80c9a0f5-kube-api-access-ddgdj\") pod \"tuned-qdrmx\" (UID: \"227f1e14-5d18-4377-b065-301e80c9a0f5\") " pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:27.619219 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.618583 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:03:27.619219 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.618607 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:03:27.619219 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.618619 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:27.619219 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:27.618681 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:28.11866278 +0000 UTC m=+3.062275828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:27.619969 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.619836 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btz89\" (UniqueName: \"kubernetes.io/projected/2119e250-ea67-47ef-ab06-f2ae21b8044f-kube-api-access-btz89\") pod \"multus-rx2m9\" (UID: \"2119e250-ea67-47ef-ab06-f2ae21b8044f\") " pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.619969 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.619932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2bns\" (UniqueName: \"kubernetes.io/projected/6b4ee727-773d-47fd-8e52-976f88918e9d-kube-api-access-g2bns\") pod \"ovnkube-node-2pmhr\" (UID: \"6b4ee727-773d-47fd-8e52-976f88918e9d\") " pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.620619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.620429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvx9\" (UniqueName: \"kubernetes.io/projected/552c454d-9201-4aa0-bce5-2d6ee55ba1c7-kube-api-access-shvx9\") pod \"iptables-alerter-2x6mt\" (UID: \"552c454d-9201-4aa0-bce5-2d6ee55ba1c7\") " pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.621698 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.621673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s4jt\" (UniqueName: \"kubernetes.io/projected/935e60a3-a3a8-4cd1-b842-f7f54efe8cb8-kube-api-access-6s4jt\") pod \"node-ca-g8588\" (UID: \"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8\") " pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.622181 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.622156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97tpb\" (UniqueName: \"kubernetes.io/projected/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-kube-api-access-97tpb\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:27.623416 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.623373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tw5k\" (UniqueName: \"kubernetes.io/projected/2811e81e-8a8b-4ab5-903d-22cce72663e2-kube-api-access-2tw5k\") pod \"node-resolver-hxbd9\" (UID: \"2811e81e-8a8b-4ab5-903d-22cce72663e2\") " pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.624618 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.624599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fbm7\" (UniqueName: \"kubernetes.io/projected/e79e1f28-6320-4694-b831-a2bd45771d4a-kube-api-access-8fbm7\") pod \"aws-ebs-csi-driver-node-6tdp2\" (UID: \"e79e1f28-6320-4694-b831-a2bd45771d4a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.648087 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.648010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" event={"ID":"a22c454fc2ba5924d3a6892717c717ec","Type":"ContainerStarted","Data":"4558fbdee1e63f242053cca38d827376765e3065a8494b1f8a6087c07d93e462"}
Apr 16 18:03:27.649351 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.649312 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" event={"ID":"f308e559b92afbe7179caa99acdc8f6c","Type":"ContainerStarted","Data":"73c317320b9c29243bf1850cfc6ccb5276ffa5ebb58f5cca7a88d188199dd688"}
Apr 16 18:03:27.675405 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.675378 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:03:27.794060 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.794023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:27.802906 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.802882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2x6mt"
Apr 16 18:03:27.812712 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.812678 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2"
Apr 16 18:03:27.818639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.818614 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hxbd9"
Apr 16 18:03:27.826276 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.826255 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:27.834898 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.834877 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-g8588"
Apr 16 18:03:27.842537 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.842517 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vbbl2"
Apr 16 18:03:27.850175 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.850148 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rx2m9"
Apr 16 18:03:27.856832 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:27.856800 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qdrmx"
Apr 16 18:03:28.114050 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.113925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:28.114050 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.114038 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:28.114294 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.114112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:29.114093399 +0000 UTC m=+4.057706437 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:28.214861 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.214828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:28.215018 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.215003 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:03:28.215055 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.215022 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:03:28.215055 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.215031 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:28.215132 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.215093 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:29.215076878 +0000 UTC m=+4.158689916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:28.450905 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.450872 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2119e250_ea67_47ef_ab06_f2ae21b8044f.slice/crio-40f3186dd757635a98c856948dafe9f261586dc98427545d2bdfa2c76309ec08 WatchSource:0}: Error finding container 40f3186dd757635a98c856948dafe9f261586dc98427545d2bdfa2c76309ec08: Status 404 returned error can't find the container with id 40f3186dd757635a98c856948dafe9f261586dc98427545d2bdfa2c76309ec08
Apr 16 18:03:28.467133 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.467107 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa8dcfb_ecd1_405d_aadb_af7f9cd89d77.slice/crio-ce7b13cc9c8e4764fd2efbcfddfb69c9139c2ced4be811b3e8b7ccd344a00303 WatchSource:0}: Error finding container ce7b13cc9c8e4764fd2efbcfddfb69c9139c2ced4be811b3e8b7ccd344a00303: Status 404 returned error can't find the container with id ce7b13cc9c8e4764fd2efbcfddfb69c9139c2ced4be811b3e8b7ccd344a00303
Apr 16 18:03:28.467887 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.467865 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode79e1f28_6320_4694_b831_a2bd45771d4a.slice/crio-0d4005c90f953ecd0e31f631facd6e0a9cf46a59ef6e48e869eec8fd495eabbe WatchSource:0}: Error finding container 0d4005c90f953ecd0e31f631facd6e0a9cf46a59ef6e48e869eec8fd495eabbe: Status 404 returned error can't find the container with id 0d4005c90f953ecd0e31f631facd6e0a9cf46a59ef6e48e869eec8fd495eabbe
Apr 16 18:03:28.468530 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.468430 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2811e81e_8a8b_4ab5_903d_22cce72663e2.slice/crio-59d659449a210c71bc741d3163e3e7d4f0fd99c1ee768983dcd3ce9478a8b236 WatchSource:0}: Error finding container 59d659449a210c71bc741d3163e3e7d4f0fd99c1ee768983dcd3ce9478a8b236: Status 404 returned error can't find the container with id 59d659449a210c71bc741d3163e3e7d4f0fd99c1ee768983dcd3ce9478a8b236
Apr 16 18:03:28.472680 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.472447 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod227f1e14_5d18_4377_b065_301e80c9a0f5.slice/crio-2ae8676f43c4b7175c30f4990b26f66d7993e9a9ef2cb4684fa30c6c5172569c WatchSource:0}: Error finding container 2ae8676f43c4b7175c30f4990b26f66d7993e9a9ef2cb4684fa30c6c5172569c: Status 404 returned error can't find the container with id 2ae8676f43c4b7175c30f4990b26f66d7993e9a9ef2cb4684fa30c6c5172569c
Apr 16 18:03:28.474140 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.474119 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4ee727_773d_47fd_8e52_976f88918e9d.slice/crio-8c7df1ec3ef8d9fae0c6adcdc95660edfb2e330d3cc0098c9fe0fbd85d70b92e WatchSource:0}: Error finding container 8c7df1ec3ef8d9fae0c6adcdc95660edfb2e330d3cc0098c9fe0fbd85d70b92e: Status 404 returned error can't find the container with id 8c7df1ec3ef8d9fae0c6adcdc95660edfb2e330d3cc0098c9fe0fbd85d70b92e
Apr 16 18:03:28.474906 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.474803 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod935e60a3_a3a8_4cd1_b842_f7f54efe8cb8.slice/crio-691e0618e20a96df5fd7cb2c15a331a37b075a6591ca3808be95e0d6d8bb2444 WatchSource:0}: Error finding container 691e0618e20a96df5fd7cb2c15a331a37b075a6591ca3808be95e0d6d8bb2444: Status 404 returned error can't find the container with id 691e0618e20a96df5fd7cb2c15a331a37b075a6591ca3808be95e0d6d8bb2444
Apr 16 18:03:28.475842 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.475820 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ef152c_1129_42dc_8a47_99e2ea30df5b.slice/crio-4e0235b1c3a6e5d8e414e7faaeede24fc79a8164835a5dadc9cbfe976319284f WatchSource:0}: Error finding container 4e0235b1c3a6e5d8e414e7faaeede24fc79a8164835a5dadc9cbfe976319284f: Status 404 returned error can't find the container with id 4e0235b1c3a6e5d8e414e7faaeede24fc79a8164835a5dadc9cbfe976319284f
Apr 16 18:03:28.476621 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:03:28.476579 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552c454d_9201_4aa0_bce5_2d6ee55ba1c7.slice/crio-8823b5f9fea15cfc2dba613b54d775a3159c0de84af1e2af78d7aa02eb178956 WatchSource:0}: Error finding container 8823b5f9fea15cfc2dba613b54d775a3159c0de84af1e2af78d7aa02eb178956: Status 404 returned error can't find the container with id 8823b5f9fea15cfc2dba613b54d775a3159c0de84af1e2af78d7aa02eb178956
Apr 16 18:03:28.545275 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.545247 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:58:26 +0000 UTC" deadline="2027-10-12 06:30:00.445674184 +0000 UTC"
Apr 16 18:03:28.545275 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.545272 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13044h26m31.900404713s"
Apr 16 18:03:28.643963 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.643937 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:28.644102 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:28.644044 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:28.652147 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.652117 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" event={"ID":"a22c454fc2ba5924d3a6892717c717ec","Type":"ContainerStarted","Data":"fc3d09c862ce097d8fa590846ea7b50637be9311fa86e4712ad0fd80ea59f484"}
Apr 16 18:03:28.653054 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.653031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2x6mt" event={"ID":"552c454d-9201-4aa0-bce5-2d6ee55ba1c7","Type":"ContainerStarted","Data":"8823b5f9fea15cfc2dba613b54d775a3159c0de84af1e2af78d7aa02eb178956"}
Apr 16 18:03:28.653937 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.653913 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hhkz9" event={"ID":"71ef152c-1129-42dc-8a47-99e2ea30df5b","Type":"ContainerStarted","Data":"4e0235b1c3a6e5d8e414e7faaeede24fc79a8164835a5dadc9cbfe976319284f"}
Apr 16 18:03:28.654984 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.654958 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g8588" event={"ID":"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8","Type":"ContainerStarted","Data":"691e0618e20a96df5fd7cb2c15a331a37b075a6591ca3808be95e0d6d8bb2444"}
Apr 16 18:03:28.655983 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.655960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"8c7df1ec3ef8d9fae0c6adcdc95660edfb2e330d3cc0098c9fe0fbd85d70b92e"}
Apr 16 18:03:28.656809 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.656791 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" event={"ID":"227f1e14-5d18-4377-b065-301e80c9a0f5","Type":"ContainerStarted","Data":"2ae8676f43c4b7175c30f4990b26f66d7993e9a9ef2cb4684fa30c6c5172569c"}
Apr 16 18:03:28.657686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.657669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerStarted","Data":"ce7b13cc9c8e4764fd2efbcfddfb69c9139c2ced4be811b3e8b7ccd344a00303"}
Apr 16 18:03:28.658596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.658577 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hxbd9" event={"ID":"2811e81e-8a8b-4ab5-903d-22cce72663e2","Type":"ContainerStarted","Data":"59d659449a210c71bc741d3163e3e7d4f0fd99c1ee768983dcd3ce9478a8b236"}
Apr 16 18:03:28.659439 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.659421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" event={"ID":"e79e1f28-6320-4694-b831-a2bd45771d4a","Type":"ContainerStarted","Data":"0d4005c90f953ecd0e31f631facd6e0a9cf46a59ef6e48e869eec8fd495eabbe"}
Apr 16 18:03:28.660270 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.660248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rx2m9" event={"ID":"2119e250-ea67-47ef-ab06-f2ae21b8044f","Type":"ContainerStarted","Data":"40f3186dd757635a98c856948dafe9f261586dc98427545d2bdfa2c76309ec08"}
Apr 16 18:03:28.668968 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:28.668915 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-88.ec2.internal" podStartSLOduration=2.6689047219999997 podStartE2EDuration="2.668904722s" podCreationTimestamp="2026-04-16 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:28.668740046 +0000 UTC m=+3.612353120" watchObservedRunningTime="2026-04-16 18:03:28.668904722 +0000 UTC m=+3.612517778"
Apr 16 18:03:29.122155 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:29.122117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:29.122376 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.122315 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:29.122478 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.122382 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:31.122362919 +0000 UTC m=+6.065975968 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:29.223208 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:29.223166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:29.223381 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.223342 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:03:29.223381 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.223366 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:03:29.223381 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.223379 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:29.223554 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.223436 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:31.223417834 +0000 UTC m=+6.167030892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:29.647072 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:29.646512 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:29.647072 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:29.646650 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:29.686834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:29.686794 2578 generic.go:358] "Generic (PLEG): container finished" podID="f308e559b92afbe7179caa99acdc8f6c" containerID="051ed512f5ced968d729a19379f2dfd5df9c17c0d51cd3e6a041c86f18221096" exitCode=0
Apr 16 18:03:29.688017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:29.687764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" event={"ID":"f308e559b92afbe7179caa99acdc8f6c","Type":"ContainerDied","Data":"051ed512f5ced968d729a19379f2dfd5df9c17c0d51cd3e6a041c86f18221096"}
Apr 16 18:03:30.644101 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:30.644068 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:30.644314 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:30.644216 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:30.724557 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:30.723833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" event={"ID":"f308e559b92afbe7179caa99acdc8f6c","Type":"ContainerStarted","Data":"1b9c446620747a246c52bdb7fbfd4939f51f2f389756788627f049295315ebfb"}
Apr 16 18:03:31.140596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:31.139937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:31.140596 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.140099 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:31.140596 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.140159 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:35.140140941 +0000 UTC m=+10.083753988 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:31.240993 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:31.240354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:31.240993 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.240530 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:03:31.240993 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.240549 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:03:31.240993 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.240562 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:31.240993 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.240621 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:35.240601449 +0000 UTC m=+10.184214483 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:31.644447 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:31.643912 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:31.644447 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:31.644061 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:32.643602 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:32.643557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:32.644090 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:32.643700 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:33.643918 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:33.643883 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:33.644378 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:33.644029 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:34.644132 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:34.644069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:34.644594 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:34.644228 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:35.174840 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:35.174795 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:35.175030 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.174950 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:35.175030 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.175024 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:43.175001126 +0000 UTC m=+18.118614174 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:35.276149 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:35.275944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:35.276343 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.276166 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:03:35.276343 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.276185 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:03:35.276343 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.276210 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:35.276343 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.276274 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:43.276254333 +0000 UTC m=+18.219867386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:35.645509 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:35.644964 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:35.645509 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:35.645098 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:36.644341 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:36.644077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:36.644518 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:36.644444 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:37.646815 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:37.646789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:37.647271 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:37.646898 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:38.644381 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:38.644346 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:38.644568 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:38.644487 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:39.645858 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:39.645833 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:39.646270 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:39.645953 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:40.643421 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:40.643379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:40.643609 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:40.643540 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:41.644202 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:41.644168 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:41.644649 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:41.644298 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:42.644138 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:42.644108 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:42.644291 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:42.644225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:43.232251 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:43.232200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:43.232431 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.232357 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:43.232431 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.232428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:59.232410978 +0000 UTC m=+34.176024015 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:03:43.332976 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:43.332938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:43.333142 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.333128 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:03:43.333234 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.333149 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:03:43.333234 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.333159 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:43.333234 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.333225 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:59.333208979 +0000 UTC m=+34.276822013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:03:43.646829 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:43.646761 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:43.647231 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:43.646900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:44.644035 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:44.643998 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:03:44.644237 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:44.644119 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68" Apr 16 18:03:45.647524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.647392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:03:45.647995 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:45.647597 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0" Apr 16 18:03:45.752703 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.752670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hhkz9" event={"ID":"71ef152c-1129-42dc-8a47-99e2ea30df5b","Type":"ContainerStarted","Data":"91af6a98d57bf3f94c44bb8d71bb523dba75c58d677b1bef0260db218c224a2a"} Apr 16 18:03:45.754478 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.754449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-g8588" event={"ID":"935e60a3-a3a8-4cd1-b842-f7f54efe8cb8","Type":"ContainerStarted","Data":"be6fc733383e1f73f4e999b27425730ae61b44446e32e31a0d21ab11aa554bff"} Apr 16 18:03:45.756843 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.756813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"646513170b78c4d261da96ecf102e39b7feec6e2f883147f320cfb908bdaa8ba"} Apr 16 18:03:45.756943 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.756848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" 
event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"1646181d99fd26621cbc2c83020ab7e033f4e0fe4580f9bd8063c57467edfbd1"} Apr 16 18:03:45.756943 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.756864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"cb2bbb36ed73a13576245f9b608d6d29da799a6d1739e85e64af4d55904d3859"} Apr 16 18:03:45.756943 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.756876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"ed7fc0c7986ddbb8d20f0aba10db62be791ff98cb85a9c2c569edc36d62f1c61"} Apr 16 18:03:45.758084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.758065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" event={"ID":"227f1e14-5d18-4377-b065-301e80c9a0f5","Type":"ContainerStarted","Data":"6a673b276adc02666f18b0c73d6f70e9d6c1e0f89ad1585b5257b3a02dd040c5"} Apr 16 18:03:45.759344 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.759316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerStarted","Data":"3162fbc593925b58c2cd049f7a2b4f9803a6e804e8acd6dd3441e976fc9e5255"} Apr 16 18:03:45.760719 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.760689 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hxbd9" event={"ID":"2811e81e-8a8b-4ab5-903d-22cce72663e2","Type":"ContainerStarted","Data":"2e20d115c706cb530be7f6e28e4cbcee815218a05e6f8a3ac8b82e594165aefc"} Apr 16 18:03:45.761941 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.761915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" event={"ID":"e79e1f28-6320-4694-b831-a2bd45771d4a","Type":"ContainerStarted","Data":"9af379aa9fa6982331b1ae829d78339598cbc4169345aa7fb57450f893dacac5"} Apr 16 18:03:45.763111 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.763092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rx2m9" event={"ID":"2119e250-ea67-47ef-ab06-f2ae21b8044f","Type":"ContainerStarted","Data":"c070001884dedbaf019b97b9cb551ea7ef20e4b4d7dd68e9d2c7d1e10436204f"} Apr 16 18:03:45.767130 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.767092 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hhkz9" podStartSLOduration=8.476781923 podStartE2EDuration="20.767061497s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.477763268 +0000 UTC m=+3.421376317" lastFinishedPulling="2026-04-16 18:03:40.768042842 +0000 UTC m=+15.711655891" observedRunningTime="2026-04-16 18:03:45.766592358 +0000 UTC m=+20.710205413" watchObservedRunningTime="2026-04-16 18:03:45.767061497 +0000 UTC m=+20.710674535" Apr 16 18:03:45.767785 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.767755 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-88.ec2.internal" podStartSLOduration=19.767745775 podStartE2EDuration="19.767745775s" podCreationTimestamp="2026-04-16 18:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:03:30.740595456 +0000 UTC m=+5.684208512" watchObservedRunningTime="2026-04-16 18:03:45.767745775 +0000 UTC m=+20.711358832" Apr 16 18:03:45.783257 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.783173 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hxbd9" 
podStartSLOduration=3.935900143 podStartE2EDuration="20.783160437s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.471120336 +0000 UTC m=+3.414733370" lastFinishedPulling="2026-04-16 18:03:45.318380625 +0000 UTC m=+20.261993664" observedRunningTime="2026-04-16 18:03:45.783018142 +0000 UTC m=+20.726631200" watchObservedRunningTime="2026-04-16 18:03:45.783160437 +0000 UTC m=+20.726773497"
Apr 16 18:03:45.796941 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.796894 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-g8588" podStartSLOduration=3.955488003 podStartE2EDuration="20.796882529s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.476955208 +0000 UTC m=+3.420568243" lastFinishedPulling="2026-04-16 18:03:45.318349734 +0000 UTC m=+20.261962769" observedRunningTime="2026-04-16 18:03:45.796775471 +0000 UTC m=+20.740388527" watchObservedRunningTime="2026-04-16 18:03:45.796882529 +0000 UTC m=+20.740495585"
Apr 16 18:03:45.835677 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.835621 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rx2m9" podStartSLOduration=3.94600302 podStartE2EDuration="20.835600801s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.465523631 +0000 UTC m=+3.409136664" lastFinishedPulling="2026-04-16 18:03:45.35512141 +0000 UTC m=+20.298734445" observedRunningTime="2026-04-16 18:03:45.81394517 +0000 UTC m=+20.757558245" watchObservedRunningTime="2026-04-16 18:03:45.835600801 +0000 UTC m=+20.779213858"
Apr 16 18:03:45.855473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:45.855432 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qdrmx" podStartSLOduration=3.01124269 podStartE2EDuration="19.855417855s" podCreationTimestamp="2026-04-16 18:03:26 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.47413539 +0000 UTC m=+3.417748438" lastFinishedPulling="2026-04-16 18:03:45.318310557 +0000 UTC m=+20.261923603" observedRunningTime="2026-04-16 18:03:45.855345071 +0000 UTC m=+20.798958127" watchObservedRunningTime="2026-04-16 18:03:45.855417855 +0000 UTC m=+20.799030911"
Apr 16 18:03:46.544801 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.544640 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:03:46.566099 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.566020 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:03:46.544796739Z","UUID":"55ed808e-9d5c-4de1-a227-4a3b4037afe7","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:03:46.568483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.568461 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:03:46.568575 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.568490 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:03:46.643586 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.643560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:46.643725 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:46.643676 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:46.766818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.766738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" event={"ID":"e79e1f28-6320-4694-b831-a2bd45771d4a","Type":"ContainerStarted","Data":"81b0d976e150d365fadabc50847d8494618dd5b18a3903f3650873a5290aa558"}
Apr 16 18:03:46.767931 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.767904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2x6mt" event={"ID":"552c454d-9201-4aa0-bce5-2d6ee55ba1c7","Type":"ContainerStarted","Data":"8b534bc31661cf19ce09b0e84dc01ad1b0ade96ad344d7d757dbcb39ee346b58"}
Apr 16 18:03:46.772013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.771990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"d60fdf4f5d14d73e31a59a177e87d7c5fb1a684dd6d7a1d9bb6f81aec122d984"}
Apr 16 18:03:46.772104 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.772019 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"f5033f82ea11a328cca1ece84a31e8d58ff5e19152291d0e76bfaaaa91d1b448"}
Apr 16 18:03:46.773386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.773362 2578 generic.go:358] "Generic (PLEG): container finished" podID="2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77" containerID="3162fbc593925b58c2cd049f7a2b4f9803a6e804e8acd6dd3441e976fc9e5255" exitCode=0
Apr 16 18:03:46.773468 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.773450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerDied","Data":"3162fbc593925b58c2cd049f7a2b4f9803a6e804e8acd6dd3441e976fc9e5255"}
Apr 16 18:03:46.807617 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:46.807577 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2x6mt" podStartSLOduration=4.968017887 podStartE2EDuration="21.807563318s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.478764265 +0000 UTC m=+3.422377298" lastFinishedPulling="2026-04-16 18:03:45.318309692 +0000 UTC m=+20.261922729" observedRunningTime="2026-04-16 18:03:46.783740345 +0000 UTC m=+21.727353401" watchObservedRunningTime="2026-04-16 18:03:46.807563318 +0000 UTC m=+21.751176419"
Apr 16 18:03:47.643585 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:47.643555 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:47.643874 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:47.643703 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:48.643755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:48.643703 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:48.644452 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:48.643832 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:48.781125 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:48.781090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"7a38ae42d45dc58af8f0c6f106b558743584d718c155aa5c945f39b28b873237"}
Apr 16 18:03:49.370708 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.370626 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:49.371442 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.371420 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:49.646801 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.646737 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:49.647176 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:49.646862 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:49.785384 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.785356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" event={"ID":"e79e1f28-6320-4694-b831-a2bd45771d4a","Type":"ContainerStarted","Data":"bc6f79418da50a6aa677dfd333e155a047c819e36dc02a1bd9dd7155d40b7501"}
Apr 16 18:03:49.785594 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.785577 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:49.786151 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.786132 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hhkz9"
Apr 16 18:03:49.803527 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:49.803484 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6tdp2" podStartSLOduration=4.321611808 podStartE2EDuration="24.803472612s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.470214596 +0000 UTC m=+3.413827633" lastFinishedPulling="2026-04-16 18:03:48.9520754 +0000 UTC m=+23.895688437" observedRunningTime="2026-04-16 18:03:49.803461495 +0000 UTC m=+24.747074550" watchObservedRunningTime="2026-04-16 18:03:49.803472612 +0000 UTC m=+24.747085668"
Apr 16 18:03:50.644398 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:50.644362 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:50.644557 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:50.644492 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:51.646256 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:51.646049 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:51.646672 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:51.646342 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:51.789652 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:51.789604 2578 generic.go:358] "Generic (PLEG): container finished" podID="2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77" containerID="bfd63880534724e9ca3e08dcbdc6041649ed11df92975d743e5f83374ea5dc50" exitCode=0
Apr 16 18:03:51.789824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:51.789692 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerDied","Data":"bfd63880534724e9ca3e08dcbdc6041649ed11df92975d743e5f83374ea5dc50"}
Apr 16 18:03:51.792980 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:51.792957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" event={"ID":"6b4ee727-773d-47fd-8e52-976f88918e9d","Type":"ContainerStarted","Data":"3082857e848ec1123bb38d7639f9163274b324685dcdfd7b8532b9fa2f1e4efc"}
Apr 16 18:03:51.844893 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:51.844843 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" podStartSLOduration=9.885439041 podStartE2EDuration="26.844828826s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.475997031 +0000 UTC m=+3.419610074" lastFinishedPulling="2026-04-16 18:03:45.435386823 +0000 UTC m=+20.378999859" observedRunningTime="2026-04-16 18:03:51.844126577 +0000 UTC m=+26.787739650" watchObservedRunningTime="2026-04-16 18:03:51.844828826 +0000 UTC m=+26.788441882"
Apr 16 18:03:52.643366 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:52.643332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:52.643517 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:52.643448 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:52.794483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:52.794458 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:03:52.794853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:52.794791 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:52.794853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:52.794815 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:52.808909 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:52.808888 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:52.809035 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:52.808961 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:53.056990 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:53.056966 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr"
Apr 16 18:03:53.646341 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:53.646316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:53.646499 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:53.646413 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:53.798309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:53.798274 2578 generic.go:358] "Generic (PLEG): container finished" podID="2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77" containerID="5e8ca235c5afba870757f79269bf6ffec6ee05ba68d7b4025152598a57aa3de3" exitCode=0
Apr 16 18:03:53.798674 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:53.798351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerDied","Data":"5e8ca235c5afba870757f79269bf6ffec6ee05ba68d7b4025152598a57aa3de3"}
Apr 16 18:03:54.643656 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:54.643623 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:54.643822 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:54.643739 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:55.644433 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:55.644269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:55.644787 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:55.644493 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:55.803997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:55.803962 2578 generic.go:358] "Generic (PLEG): container finished" podID="2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77" containerID="ef2031930844eb90381ea5f33d5c4ccee4085ac3fc78e7e07b46c1bfeb5ec6fc" exitCode=0
Apr 16 18:03:55.804136 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:55.804013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerDied","Data":"ef2031930844eb90381ea5f33d5c4ccee4085ac3fc78e7e07b46c1bfeb5ec6fc"}
Apr 16 18:03:56.643366 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:56.643335 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:56.643524 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:56.643463 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:57.647137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:57.647104 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:57.647580 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:57.647242 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:03:58.643767 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:58.643694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:58.643965 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:58.643837 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:03:59.247162 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:59.247125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:59.247732 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.247295 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:59.247732 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.247381 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs podName:87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:31.247357918 +0000 UTC m=+66.190970967 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs") pod "network-metrics-daemon-t49dd" (UID: "87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:03:59.348447 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:59.348414 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:03:59.348641 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.348620 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:03:59.348697 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.348648 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:03:59.348697 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.348661 2578 projected.go:194] Error preparing data for projected volume kube-api-access-r9rpn for pod openshift-network-diagnostics/network-check-target-9qqgk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:59.348787 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.348727 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn podName:4af28085-16ca-4a81-b155-9c85f1f05a68 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:31.34870721 +0000 UTC m=+66.292320261 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-r9rpn" (UniqueName: "kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn") pod "network-check-target-9qqgk" (UID: "4af28085-16ca-4a81-b155-9c85f1f05a68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:03:59.643536 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:03:59.643452 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:03:59.643678 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:03:59.643605 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:04:00.644228 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:00.644177 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:04:00.644661 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:00.644305 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:04:01.644077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:01.643893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:04:01.644077 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:01.644043 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:04:01.819000 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:01.818966 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerStarted","Data":"0d6fb2142bbd5d5df9517417b86c03978eb1201314c742fac9855c139a73aa7f"}
Apr 16 18:04:02.037574 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:02.037543 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t49dd"]
Apr 16 18:04:02.037740 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:02.037693 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:04:02.037844 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:02.037810 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:04:02.038134 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:02.038109 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9qqgk"]
Apr 16 18:04:02.038279 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:02.038226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:04:02.038356 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:02.038339 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:04:02.823278 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:02.823179 2578 generic.go:358] "Generic (PLEG): container finished" podID="2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77" containerID="0d6fb2142bbd5d5df9517417b86c03978eb1201314c742fac9855c139a73aa7f" exitCode=0
Apr 16 18:04:02.823278 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:02.823220 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerDied","Data":"0d6fb2142bbd5d5df9517417b86c03978eb1201314c742fac9855c139a73aa7f"}
Apr 16 18:04:03.646342 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:03.646151 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:04:03.646477 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:03.646151 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:04:03.646477 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:03.646423 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:04:03.646563 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:03.646523 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:04:03.827312 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:03.827282 2578 generic.go:358] "Generic (PLEG): container finished" podID="2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77" containerID="ba1af7e64afcfabe30e77d86b2509082de559cf037f93173a9022be4e9adeee1" exitCode=0
Apr 16 18:04:03.827656 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:03.827331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerDied","Data":"ba1af7e64afcfabe30e77d86b2509082de559cf037f93173a9022be4e9adeee1"}
Apr 16 18:04:04.831397 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:04.831362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" event={"ID":"2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77","Type":"ContainerStarted","Data":"6fdf8d9f56fee87b40b579866b055298d4108710858e2e9f620d479d2281feb3"}
Apr 16 18:04:04.857067 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:04.857018 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vbbl2" podStartSLOduration=6.75377799 podStartE2EDuration="39.857005341s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:03:28.470233006 +0000 UTC m=+3.413846041" lastFinishedPulling="2026-04-16 18:04:01.573460342 +0000 UTC m=+36.517073392" observedRunningTime="2026-04-16 18:04:04.856687877 +0000 UTC m=+39.800300932" watchObservedRunningTime="2026-04-16 18:04:04.857005341 +0000 UTC m=+39.800618406"
Apr 16 18:04:05.647078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:05.647052 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd"
Apr 16 18:04:05.647282 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:05.647054 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:04:05.647282 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:05.647168 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t49dd" podUID="87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0"
Apr 16 18:04:05.647282 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:05.647225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9qqgk" podUID="4af28085-16ca-4a81-b155-9c85f1f05a68"
Apr 16 18:04:06.401408 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.401342 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-88.ec2.internal" event="NodeReady"
Apr 16 18:04:06.401828 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.401442 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:04:06.446347 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.446321 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7t9nd"]
Apr 16 18:04:06.448964 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.448944 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7t9nd"
Apr 16 18:04:06.449094 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.449068 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6dthq"]
Apr 16 18:04:06.451162 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.451142 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:04:06.451282 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.451161 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:04:06.451282 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.451144 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:04:06.451282 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.451233 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:04:06.451282 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.451242 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nw878\""
Apr 16 18:04:06.452023 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.452007 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6dthq"
Apr 16 18:04:06.453611 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.453594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:04:06.453803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.453788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzqks\""
Apr 16 18:04:06.453889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.453827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:04:06.461710 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.461691 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6dthq"]
Apr 16 18:04:06.465645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.465629 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7t9nd"]
Apr 16 18:04:06.556201 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.556171 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-79ghb"]
Apr 16 18:04:06.559054 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.559035 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-79ghb"
Apr 16 18:04:06.561380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.561361 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:04:06.561521 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.561458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fkw8n\""
Apr 16 18:04:06.561521 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.561488 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:04:06.561808 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.561793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:04:06.570551 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.570531 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-79ghb"]
Apr 16 18:04:06.604536 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1ab75e1c-1aff-4d37-a06b-03b4f389061d-crio-socket\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd"
Apr 16 18:04:06.604652 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56ss\" (UniqueName: \"kubernetes.io/projected/1ab75e1c-1aff-4d37-a06b-03b4f389061d-kube-api-access-m56ss\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " 
pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.604652 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1164c58-e91c-4b0c-93e5-28d0244988b6-metrics-tls\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.604724 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604654 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1164c58-e91c-4b0c-93e5-28d0244988b6-config-volume\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.604724 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1164c58-e91c-4b0c-93e5-28d0244988b6-tmp-dir\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.604724 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1ab75e1c-1aff-4d37-a06b-03b4f389061d-data-volume\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.604724 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8f74\" (UniqueName: 
\"kubernetes.io/projected/b1164c58-e91c-4b0c-93e5-28d0244988b6-kube-api-access-q8f74\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.604864 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1ab75e1c-1aff-4d37-a06b-03b4f389061d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.604864 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.604771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1ab75e1c-1aff-4d37-a06b-03b4f389061d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.705682 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m56ss\" (UniqueName: \"kubernetes.io/projected/1ab75e1c-1aff-4d37-a06b-03b4f389061d-kube-api-access-m56ss\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.705833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1164c58-e91c-4b0c-93e5-28d0244988b6-metrics-tls\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.705833 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:04:06.705722 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1164c58-e91c-4b0c-93e5-28d0244988b6-config-volume\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.705833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1164c58-e91c-4b0c-93e5-28d0244988b6-tmp-dir\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.705833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1ab75e1c-1aff-4d37-a06b-03b4f389061d-data-volume\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.706008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8f74\" (UniqueName: \"kubernetes.io/projected/b1164c58-e91c-4b0c-93e5-28d0244988b6-kube-api-access-q8f74\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.706008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1ab75e1c-1aff-4d37-a06b-03b4f389061d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.706008 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1ab75e1c-1aff-4d37-a06b-03b4f389061d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.706008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705919 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c118da2f-c36c-4fca-859c-34ed40076370-cert\") pod \"ingress-canary-79ghb\" (UID: \"c118da2f-c36c-4fca-859c-34ed40076370\") " pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.706008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb8hm\" (UniqueName: \"kubernetes.io/projected/c118da2f-c36c-4fca-859c-34ed40076370-kube-api-access-gb8hm\") pod \"ingress-canary-79ghb\" (UID: \"c118da2f-c36c-4fca-859c-34ed40076370\") " pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.706008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.705978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1ab75e1c-1aff-4d37-a06b-03b4f389061d-crio-socket\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.706307 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.706248 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b1164c58-e91c-4b0c-93e5-28d0244988b6-tmp-dir\") pod \"dns-default-6dthq\" (UID: 
\"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.706307 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.706275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1ab75e1c-1aff-4d37-a06b-03b4f389061d-data-volume\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.706380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.706330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1ab75e1c-1aff-4d37-a06b-03b4f389061d-crio-socket\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.706413 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.706392 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1164c58-e91c-4b0c-93e5-28d0244988b6-config-volume\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.706483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.706465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1ab75e1c-1aff-4d37-a06b-03b4f389061d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.709986 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.709963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b1164c58-e91c-4b0c-93e5-28d0244988b6-metrics-tls\") pod \"dns-default-6dthq\" (UID: 
\"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.710078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.710010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1ab75e1c-1aff-4d37-a06b-03b4f389061d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.713126 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.713107 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8f74\" (UniqueName: \"kubernetes.io/projected/b1164c58-e91c-4b0c-93e5-28d0244988b6-kube-api-access-q8f74\") pod \"dns-default-6dthq\" (UID: \"b1164c58-e91c-4b0c-93e5-28d0244988b6\") " pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.713224 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.713161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56ss\" (UniqueName: \"kubernetes.io/projected/1ab75e1c-1aff-4d37-a06b-03b4f389061d-kube-api-access-m56ss\") pod \"insights-runtime-extractor-7t9nd\" (UID: \"1ab75e1c-1aff-4d37-a06b-03b4f389061d\") " pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.759161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.759131 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7t9nd" Apr 16 18:04:06.764898 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.764870 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:06.806891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.806833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c118da2f-c36c-4fca-859c-34ed40076370-cert\") pod \"ingress-canary-79ghb\" (UID: \"c118da2f-c36c-4fca-859c-34ed40076370\") " pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.806891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.806868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gb8hm\" (UniqueName: \"kubernetes.io/projected/c118da2f-c36c-4fca-859c-34ed40076370-kube-api-access-gb8hm\") pod \"ingress-canary-79ghb\" (UID: \"c118da2f-c36c-4fca-859c-34ed40076370\") " pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.810042 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.810020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c118da2f-c36c-4fca-859c-34ed40076370-cert\") pod \"ingress-canary-79ghb\" (UID: \"c118da2f-c36c-4fca-859c-34ed40076370\") " pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.814062 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.813977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb8hm\" (UniqueName: \"kubernetes.io/projected/c118da2f-c36c-4fca-859c-34ed40076370-kube-api-access-gb8hm\") pod \"ingress-canary-79ghb\" (UID: \"c118da2f-c36c-4fca-859c-34ed40076370\") " pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.868007 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.867794 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-79ghb" Apr 16 18:04:06.899364 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.899331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6dthq"] Apr 16 18:04:06.902988 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.902965 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7t9nd"] Apr 16 18:04:06.903602 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:06.903572 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1164c58_e91c_4b0c_93e5_28d0244988b6.slice/crio-13be6d19ea72bb8e9ed391e2ca1ccb352f5f90fdedfe156973794b4d4198ff7a WatchSource:0}: Error finding container 13be6d19ea72bb8e9ed391e2ca1ccb352f5f90fdedfe156973794b4d4198ff7a: Status 404 returned error can't find the container with id 13be6d19ea72bb8e9ed391e2ca1ccb352f5f90fdedfe156973794b4d4198ff7a Apr 16 18:04:06.906810 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:06.906782 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ab75e1c_1aff_4d37_a06b_03b4f389061d.slice/crio-d8ccde4af161b1c67d1be6ee97de3f5f04ef4d3903b44011726eb31817eab805 WatchSource:0}: Error finding container d8ccde4af161b1c67d1be6ee97de3f5f04ef4d3903b44011726eb31817eab805: Status 404 returned error can't find the container with id d8ccde4af161b1c67d1be6ee97de3f5f04ef4d3903b44011726eb31817eab805 Apr 16 18:04:06.988684 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:06.988662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-79ghb"] Apr 16 18:04:06.991890 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:06.991867 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc118da2f_c36c_4fca_859c_34ed40076370.slice/crio-9227a184b999b17a0610566cbcee9d8eaae89e151de39aba2281efe1a50af694 WatchSource:0}: Error finding container 9227a184b999b17a0610566cbcee9d8eaae89e151de39aba2281efe1a50af694: Status 404 returned error can't find the container with id 9227a184b999b17a0610566cbcee9d8eaae89e151de39aba2281efe1a50af694 Apr 16 18:04:07.643620 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.643596 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:04:07.643620 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.643610 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:04:07.646784 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.646475 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z2hb6\"" Apr 16 18:04:07.646784 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.646500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:04:07.646784 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.646536 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qtxn\"" Apr 16 18:04:07.646784 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.646482 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:04:07.646784 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.646719 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:04:07.839577 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:04:07.839487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7t9nd" event={"ID":"1ab75e1c-1aff-4d37-a06b-03b4f389061d","Type":"ContainerStarted","Data":"7d4817efc68b0f232a1073f1d52729c60737b11abd08f7800094d4315fda3250"} Apr 16 18:04:07.839577 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.839530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7t9nd" event={"ID":"1ab75e1c-1aff-4d37-a06b-03b4f389061d","Type":"ContainerStarted","Data":"1d7d963b4d8ce6c4f893c0442a2d1b7fd40bf7c7fa2d83fe8e9cbe643623933b"} Apr 16 18:04:07.839577 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.839546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7t9nd" event={"ID":"1ab75e1c-1aff-4d37-a06b-03b4f389061d","Type":"ContainerStarted","Data":"d8ccde4af161b1c67d1be6ee97de3f5f04ef4d3903b44011726eb31817eab805"} Apr 16 18:04:07.840682 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.840657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-79ghb" event={"ID":"c118da2f-c36c-4fca-859c-34ed40076370","Type":"ContainerStarted","Data":"9227a184b999b17a0610566cbcee9d8eaae89e151de39aba2281efe1a50af694"} Apr 16 18:04:07.841758 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:07.841738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6dthq" event={"ID":"b1164c58-e91c-4b0c-93e5-28d0244988b6","Type":"ContainerStarted","Data":"13be6d19ea72bb8e9ed391e2ca1ccb352f5f90fdedfe156973794b4d4198ff7a"} Apr 16 18:04:08.848835 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:08.848651 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-79ghb" event={"ID":"c118da2f-c36c-4fca-859c-34ed40076370","Type":"ContainerStarted","Data":"d434128c68033c87ba19c74e8d98e03adbac76bc0534c92f4a2e9984feeb8fdc"} Apr 16 
18:04:08.851410 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:08.851363 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6dthq" event={"ID":"b1164c58-e91c-4b0c-93e5-28d0244988b6","Type":"ContainerStarted","Data":"20cd89585a01798945297223830f65f009590aa2a52a294e35f5cc76555bab74"} Apr 16 18:04:08.866015 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:08.865961 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-79ghb" podStartSLOduration=1.192175324 podStartE2EDuration="2.865944308s" podCreationTimestamp="2026-04-16 18:04:06 +0000 UTC" firstStartedPulling="2026-04-16 18:04:06.993733447 +0000 UTC m=+41.937346480" lastFinishedPulling="2026-04-16 18:04:08.667502412 +0000 UTC m=+43.611115464" observedRunningTime="2026-04-16 18:04:08.865691446 +0000 UTC m=+43.809304508" watchObservedRunningTime="2026-04-16 18:04:08.865944308 +0000 UTC m=+43.809557361" Apr 16 18:04:09.856114 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.856065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6dthq" event={"ID":"b1164c58-e91c-4b0c-93e5-28d0244988b6","Type":"ContainerStarted","Data":"ff4b05c103675d5150da04a211152b26eeddf2976bf7e9909efdc3a57e33ce3e"} Apr 16 18:04:09.856570 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.856218 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:09.857771 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.857752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7t9nd" event={"ID":"1ab75e1c-1aff-4d37-a06b-03b4f389061d","Type":"ContainerStarted","Data":"0ab64b20fc15e2d3b2cf5e554615a0d56d19a7aa9ea938e3d670b21795f18e8e"} Apr 16 18:04:09.872141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.872114 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-78598f7d98-jvvb4"] Apr 16 18:04:09.874041 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.874028 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:09.876079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876059 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:04:09.876185 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bxlzl\"" Apr 16 18:04:09.876418 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876399 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:04:09.876524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876456 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:04:09.876524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876475 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:04:09.876651 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876637 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:04:09.876720 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:04:09.876720 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.876716 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:04:09.881637 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.881283 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6dthq" podStartSLOduration=2.12206912 podStartE2EDuration="3.881267851s" podCreationTimestamp="2026-04-16 18:04:06 +0000 UTC" firstStartedPulling="2026-04-16 18:04:06.906256308 +0000 UTC m=+41.849869358" lastFinishedPulling="2026-04-16 18:04:08.665455054 +0000 UTC m=+43.609068089" observedRunningTime="2026-04-16 18:04:09.880304618 +0000 UTC m=+44.823917692" watchObservedRunningTime="2026-04-16 18:04:09.881267851 +0000 UTC m=+44.824880914" Apr 16 18:04:09.882242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.882222 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:04:09.887011 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.886988 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78598f7d98-jvvb4"] Apr 16 18:04:09.901228 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:09.901174 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7t9nd" podStartSLOduration=1.216999613 podStartE2EDuration="3.901161478s" podCreationTimestamp="2026-04-16 18:04:06 +0000 UTC" firstStartedPulling="2026-04-16 18:04:06.993603041 +0000 UTC m=+41.937216075" lastFinishedPulling="2026-04-16 18:04:09.677764892 +0000 UTC m=+44.621377940" observedRunningTime="2026-04-16 18:04:09.900440167 +0000 UTC m=+44.844053243" watchObservedRunningTime="2026-04-16 18:04:09.901161478 +0000 UTC m=+44.844774533" Apr 16 18:04:10.031795 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.031760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-config\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " 
pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.031977 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.031869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-trusted-ca-bundle\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.031977 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.031931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-serving-cert\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.032083 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.031978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-oauth-serving-cert\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.032083 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.032021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-oauth-config\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.032083 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.032072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-service-ca\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.032261 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.032234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zvs2\" (UniqueName: \"kubernetes.io/projected/04a3b62b-74ad-4f16-92c3-0d072a1996c1-kube-api-access-7zvs2\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133383 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-service-ca\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zvs2\" (UniqueName: \"kubernetes.io/projected/04a3b62b-74ad-4f16-92c3-0d072a1996c1-kube-api-access-7zvs2\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133461 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-config\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133493 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-trusted-ca-bundle\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133643 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-serving-cert\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133696 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-oauth-serving-cert\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.133696 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.133679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-oauth-config\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.134246 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.134230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-config\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.134307 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:04:10.134248 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-oauth-serving-cert\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.134307 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.134280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-service-ca\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.134454 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.134438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-trusted-ca-bundle\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.137870 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.137844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-serving-cert\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.137870 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.137852 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-oauth-config\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 
18:04:10.141675 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.141655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zvs2\" (UniqueName: \"kubernetes.io/projected/04a3b62b-74ad-4f16-92c3-0d072a1996c1-kube-api-access-7zvs2\") pod \"console-78598f7d98-jvvb4\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") " pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.186450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.186416 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:10.304835 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.304762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78598f7d98-jvvb4"] Apr 16 18:04:10.309008 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:10.308984 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a3b62b_74ad_4f16_92c3_0d072a1996c1.slice/crio-3bd166751a9cd74b52802447f1a8143184c3174087c316a69c581627ea362700 WatchSource:0}: Error finding container 3bd166751a9cd74b52802447f1a8143184c3174087c316a69c581627ea362700: Status 404 returned error can't find the container with id 3bd166751a9cd74b52802447f1a8143184c3174087c316a69c581627ea362700 Apr 16 18:04:10.862486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:10.862450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78598f7d98-jvvb4" event={"ID":"04a3b62b-74ad-4f16-92c3-0d072a1996c1","Type":"ContainerStarted","Data":"3bd166751a9cd74b52802447f1a8143184c3174087c316a69c581627ea362700"} Apr 16 18:04:12.022870 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.022678 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-vz96w"] Apr 16 18:04:12.025314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.025297 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.027182 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.027158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:04:12.027643 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.027619 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:04:12.027788 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.027772 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:04:12.028079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.028060 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-jsrfh\"" Apr 16 18:04:12.028166 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.028113 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:04:12.028233 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.028065 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:04:12.036617 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.036563 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-vz96w"] Apr 16 18:04:12.048123 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.048102 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-tmbf4"] Apr 16 18:04:12.048351 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.048324 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.048448 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.048377 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkkwp\" (UniqueName: \"kubernetes.io/projected/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-kube-api-access-vkkwp\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.048448 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.048420 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.048585 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.048560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.050703 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.050682 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.051823 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.051793 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9vhnl"] Apr 16 18:04:12.052958 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.052895 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:04:12.053070 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.052966 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:04:12.053070 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.052970 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:04:12.053228 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.053077 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-xqcc8\"" Apr 16 18:04:12.054283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.054239 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.055931 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.055913 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:04:12.056008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.055988 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dln79\"" Apr 16 18:04:12.056149 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.056128 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:04:12.056309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.056261 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:04:12.065251 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.065228 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-tmbf4"] Apr 16 18:04:12.149541 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ece5b02a-1c86-402d-a07a-3645d98afe73-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.149719 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9vhnl\" (UID: 
\"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.149719 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-accelerators-collector-config\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.149719 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-sys\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.149863 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149746 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-wtmp\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.149863 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.149863 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149833 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.149863 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.150034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149893 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkkwp\" (UniqueName: \"kubernetes.io/projected/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-kube-api-access-vkkwp\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.150034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.150034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149957 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/ece5b02a-1c86-402d-a07a-3645d98afe73-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.150034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.149989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-root\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.150034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-textfile\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.150274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567dc\" (UniqueName: \"kubernetes.io/projected/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-api-access-567dc\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.150274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfm4\" (UniqueName: \"kubernetes.io/projected/7e057c09-2a88-4a81-99a0-c209f07556a8-kube-api-access-mbfm4\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 
18:04:12.150274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.150274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-tls\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.150432 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e057c09-2a88-4a81-99a0-c209f07556a8-metrics-client-ca\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.150432 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.150637 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.150619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.153752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.153730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.153866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.153809 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.164052 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.164021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkkwp\" (UniqueName: \"kubernetes.io/projected/9bae2b33-3e4a-4468-84d7-208d6ae92a1a-kube-api-access-vkkwp\") pod \"openshift-state-metrics-5669946b84-vz96w\" (UID: \"9bae2b33-3e4a-4468-84d7-208d6ae92a1a\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.250988 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.250947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ece5b02a-1c86-402d-a07a-3645d98afe73-volume-directive-shadow\") pod 
\"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.250997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251033 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-accelerators-collector-config\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-sys\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-wtmp\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ece5b02a-1c86-402d-a07a-3645d98afe73-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-root\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251280 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-textfile\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" 
Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-567dc\" (UniqueName: \"kubernetes.io/projected/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-api-access-567dc\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-sys\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfm4\" (UniqueName: \"kubernetes.io/projected/7e057c09-2a88-4a81-99a0-c209f07556a8-kube-api-access-mbfm4\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/ece5b02a-1c86-402d-a07a-3645d98afe73-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-tls\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e057c09-2a88-4a81-99a0-c209f07556a8-metrics-client-ca\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-root\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251978 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.251593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-wtmp\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.251978 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:12.251894 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret 
"node-exporter-tls" not found Apr 16 18:04:12.251978 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:12.251953 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-tls podName:7e057c09-2a88-4a81-99a0-c209f07556a8 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:12.7519334 +0000 UTC m=+47.695546446 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-tls") pod "node-exporter-9vhnl" (UID: "7e057c09-2a88-4a81-99a0-c209f07556a8") : secret "node-exporter-tls" not found Apr 16 18:04:12.253220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.252369 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ece5b02a-1c86-402d-a07a-3645d98afe73-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.253220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.252701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.253220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.252805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e057c09-2a88-4a81-99a0-c209f07556a8-metrics-client-ca\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " 
pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.253439 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.253278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-textfile\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.255442 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.253715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-accelerators-collector-config\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.255442 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.255363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.255642 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.255462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.256140 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.256114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.262213 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.262177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbfm4\" (UniqueName: \"kubernetes.io/projected/7e057c09-2a88-4a81-99a0-c209f07556a8-kube-api-access-mbfm4\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.264050 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.263992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-567dc\" (UniqueName: \"kubernetes.io/projected/ece5b02a-1c86-402d-a07a-3645d98afe73-kube-api-access-567dc\") pod \"kube-state-metrics-7479c89684-tmbf4\" (UID: \"ece5b02a-1c86-402d-a07a-3645d98afe73\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.338036 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.337963 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" Apr 16 18:04:12.365459 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.365427 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" Apr 16 18:04:12.755686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.755647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-tls\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.758292 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.758271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7e057c09-2a88-4a81-99a0-c209f07556a8-node-exporter-tls\") pod \"node-exporter-9vhnl\" (UID: \"7e057c09-2a88-4a81-99a0-c209f07556a8\") " pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.962214 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.962168 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-vz96w"] Apr 16 18:04:12.965463 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.965438 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-tmbf4"] Apr 16 18:04:12.967780 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:12.967754 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bae2b33_3e4a_4468_84d7_208d6ae92a1a.slice/crio-04a649aa246bf8d683af2f83bb2f1553284ab399344a247c4d9c475dfafd00fa WatchSource:0}: Error finding container 04a649aa246bf8d683af2f83bb2f1553284ab399344a247c4d9c475dfafd00fa: Status 404 returned error can't find the container with id 04a649aa246bf8d683af2f83bb2f1553284ab399344a247c4d9c475dfafd00fa Apr 16 18:04:12.968941 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:12.968915 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podece5b02a_1c86_402d_a07a_3645d98afe73.slice/crio-b6fb9f2e03499737e56ba5154577fe4cab4883d2e43d33564a04821111985743 WatchSource:0}: Error finding container b6fb9f2e03499737e56ba5154577fe4cab4883d2e43d33564a04821111985743: Status 404 returned error can't find the container with id b6fb9f2e03499737e56ba5154577fe4cab4883d2e43d33564a04821111985743 Apr 16 18:04:12.970285 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.970269 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9vhnl" Apr 16 18:04:12.977929 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:12.977907 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e057c09_2a88_4a81_99a0_c209f07556a8.slice/crio-5d405f65fe839e707a686104a958c56846e5772507ac3737231941e106a7962f WatchSource:0}: Error finding container 5d405f65fe839e707a686104a958c56846e5772507ac3737231941e106a7962f: Status 404 returned error can't find the container with id 5d405f65fe839e707a686104a958c56846e5772507ac3737231941e106a7962f Apr 16 18:04:12.995745 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:12.995723 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:04:13.004124 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.004104 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.012494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012435 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:04:13.012494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012459 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:04:13.012760 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012442 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:04:13.012760 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012556 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:04:13.012760 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012442 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:04:13.012948 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012931 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:04:13.013006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.012990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:04:13.013006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.013002 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:04:13.013106 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.013038 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:04:13.013383 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.013329 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-c4jn7\"" Apr 16 18:04:13.014519 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.014499 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:04:13.058109 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-config-out\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058176 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-config-volume\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrxfx\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-kube-api-access-qrxfx\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058465 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-web-config\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058495 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.058752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.058547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159440 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159169 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159570 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-config-out\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159570 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159527 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159673 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159741 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159687 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159800 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-config-volume\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159800 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159800 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrxfx\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-kube-api-access-qrxfx\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159925 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159925 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.159925 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-web-config\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.160068 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:13.159937 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 18:04:13.160068 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:13.159994 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls podName:56e89b7d-804f-4859-824d-cca58032953e nodeName:}" failed. No retries permitted until 2026-04-16 18:04:13.659974513 +0000 UTC m=+48.603587561 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "56e89b7d-804f-4859-824d-cca58032953e") : secret "alertmanager-main-tls" not found Apr 16 18:04:13.161321 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.159934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.161321 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.160317 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.161321 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.160378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.161321 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.160979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.161617 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.161568 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.162639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.162618 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-config-out\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.162841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.162818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.162970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.162947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-config-volume\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.163392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.163358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.163471 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.163395 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.163889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.163869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.164361 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.164342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.164566 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.164549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-web-config\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.169024 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.169002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrxfx\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-kube-api-access-qrxfx\") pod \"alertmanager-main-0\" (UID: 
\"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.665610 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.665576 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.669303 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.669275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:13.873487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.873412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" event={"ID":"9bae2b33-3e4a-4468-84d7-208d6ae92a1a","Type":"ContainerStarted","Data":"4dad4394c1857583c535ee756d36b03d30bc1b26455534e75e02ea78814ff15d"} Apr 16 18:04:13.873487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.873470 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" event={"ID":"9bae2b33-3e4a-4468-84d7-208d6ae92a1a","Type":"ContainerStarted","Data":"2b9daf002126ea7a3bd9b1ca555c25adb011a4697543a430464326187b7bd87d"} Apr 16 18:04:13.873487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.873489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" 
event={"ID":"9bae2b33-3e4a-4468-84d7-208d6ae92a1a","Type":"ContainerStarted","Data":"04a649aa246bf8d683af2f83bb2f1553284ab399344a247c4d9c475dfafd00fa"} Apr 16 18:04:13.874605 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.874578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" event={"ID":"ece5b02a-1c86-402d-a07a-3645d98afe73","Type":"ContainerStarted","Data":"b6fb9f2e03499737e56ba5154577fe4cab4883d2e43d33564a04821111985743"} Apr 16 18:04:13.875902 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.875872 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78598f7d98-jvvb4" event={"ID":"04a3b62b-74ad-4f16-92c3-0d072a1996c1","Type":"ContainerStarted","Data":"c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30"} Apr 16 18:04:13.876874 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.876854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9vhnl" event={"ID":"7e057c09-2a88-4a81-99a0-c209f07556a8","Type":"ContainerStarted","Data":"5d405f65fe839e707a686104a958c56846e5772507ac3737231941e106a7962f"} Apr 16 18:04:13.896405 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.896364 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78598f7d98-jvvb4" podStartSLOduration=2.265504856 podStartE2EDuration="4.896351814s" podCreationTimestamp="2026-04-16 18:04:09 +0000 UTC" firstStartedPulling="2026-04-16 18:04:10.311559306 +0000 UTC m=+45.255172340" lastFinishedPulling="2026-04-16 18:04:12.94240626 +0000 UTC m=+47.886019298" observedRunningTime="2026-04-16 18:04:13.894544889 +0000 UTC m=+48.838157945" watchObservedRunningTime="2026-04-16 18:04:13.896351814 +0000 UTC m=+48.839964868" Apr 16 18:04:13.924082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:13.923992 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:04:14.546287 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.546087 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:04:14.551834 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:14.551804 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56e89b7d_804f_4859_824d_cca58032953e.slice/crio-ac0d704ff6f33ed47fa30c9b0d627ba93ba3428a3781e8436095df2abd284bc9 WatchSource:0}: Error finding container ac0d704ff6f33ed47fa30c9b0d627ba93ba3428a3781e8436095df2abd284bc9: Status 404 returned error can't find the container with id ac0d704ff6f33ed47fa30c9b0d627ba93ba3428a3781e8436095df2abd284bc9 Apr 16 18:04:14.881607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.881572 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e057c09-2a88-4a81-99a0-c209f07556a8" containerID="437d5ba4aed69781ab5a9c27116eb3af63072d3a73fd54e35e64fe745b5075f0" exitCode=0 Apr 16 18:04:14.881790 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.881657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9vhnl" event={"ID":"7e057c09-2a88-4a81-99a0-c209f07556a8","Type":"ContainerDied","Data":"437d5ba4aed69781ab5a9c27116eb3af63072d3a73fd54e35e64fe745b5075f0"} Apr 16 18:04:14.882788 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.882758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"ac0d704ff6f33ed47fa30c9b0d627ba93ba3428a3781e8436095df2abd284bc9"} Apr 16 18:04:14.884560 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.884541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" 
event={"ID":"9bae2b33-3e4a-4468-84d7-208d6ae92a1a","Type":"ContainerStarted","Data":"ebb016147e1cf4463e972d8f9fa0558b8a31e015b2d11ee084e7a27f0a64c0e0"} Apr 16 18:04:14.886781 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.886761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" event={"ID":"ece5b02a-1c86-402d-a07a-3645d98afe73","Type":"ContainerStarted","Data":"7d6fa7556002e84a7ec907c0f170a69223740112a5eb679f18660cbf6ae1358b"} Apr 16 18:04:14.886886 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.886788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" event={"ID":"ece5b02a-1c86-402d-a07a-3645d98afe73","Type":"ContainerStarted","Data":"a0c54e9b247f7da0d310c34268d7f57fffb5ed80890c8e7c15e7a044511acb2b"} Apr 16 18:04:14.886886 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.886802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" event={"ID":"ece5b02a-1c86-402d-a07a-3645d98afe73","Type":"ContainerStarted","Data":"2493feb661f2c3c3f729435f9a2b33a7b56cce0cab7a87c6a415150a49543b4c"} Apr 16 18:04:14.907155 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.907130 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7675b699c9-9254t"] Apr 16 18:04:14.910341 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.910326 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.912563 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912541 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-r5m8h\"" Apr 16 18:04:14.912674 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912567 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:04:14.912674 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912551 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:04:14.912674 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912568 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:04:14.912674 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912660 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9n81tkrrl2jqj\"" Apr 16 18:04:14.912881 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912826 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:04:14.912932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.912898 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:04:14.923743 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.923716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7675b699c9-9254t"] Apr 16 18:04:14.926940 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.926856 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-vz96w" podStartSLOduration=1.640443654 podStartE2EDuration="2.926840191s" podCreationTimestamp="2026-04-16 18:04:12 +0000 UTC" firstStartedPulling="2026-04-16 18:04:13.124862306 +0000 UTC m=+48.068475339" lastFinishedPulling="2026-04-16 18:04:14.411258828 +0000 UTC m=+49.354871876" observedRunningTime="2026-04-16 18:04:14.926594161 +0000 UTC m=+49.870207217" watchObservedRunningTime="2026-04-16 18:04:14.926840191 +0000 UTC m=+49.870453248" Apr 16 18:04:14.953912 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.953867 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-tmbf4" podStartSLOduration=1.5137874660000001 podStartE2EDuration="2.953853258s" podCreationTimestamp="2026-04-16 18:04:12 +0000 UTC" firstStartedPulling="2026-04-16 18:04:12.97060481 +0000 UTC m=+47.914217843" lastFinishedPulling="2026-04-16 18:04:14.410670587 +0000 UTC m=+49.354283635" observedRunningTime="2026-04-16 18:04:14.952743741 +0000 UTC m=+49.896356799" watchObservedRunningTime="2026-04-16 18:04:14.953853258 +0000 UTC m=+49.897466314" Apr 16 18:04:14.976353 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.976326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-tls\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.976511 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.976491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.976884 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.976857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f3e75c0-4839-425b-951d-c38abf6a16b5-metrics-client-ca\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.976980 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.976956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.977382 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.977360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg5vj\" (UniqueName: \"kubernetes.io/projected/9f3e75c0-4839-425b-951d-c38abf6a16b5-kube-api-access-bg5vj\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.977471 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.977425 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-grpc-tls\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 
18:04:14.977661 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.977637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:14.977818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:14.977749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.078550 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.078698 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bg5vj\" (UniqueName: \"kubernetes.io/projected/9f3e75c0-4839-425b-951d-c38abf6a16b5-kube-api-access-bg5vj\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.078739 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078695 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-grpc-tls\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.078771 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.078805 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.078940 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-tls\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.079016 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.078998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.079064 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.079041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f3e75c0-4839-425b-951d-c38abf6a16b5-metrics-client-ca\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.079743 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.079719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f3e75c0-4839-425b-951d-c38abf6a16b5-metrics-client-ca\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.082259 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.082234 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.082376 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.082355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " 
pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.082419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.082377 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-grpc-tls\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.082419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.082408 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.082486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.082415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.082642 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.082622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9f3e75c0-4839-425b-951d-c38abf6a16b5-secret-thanos-querier-tls\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.091202 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.091175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bg5vj\" (UniqueName: \"kubernetes.io/projected/9f3e75c0-4839-425b-951d-c38abf6a16b5-kube-api-access-bg5vj\") pod \"thanos-querier-7675b699c9-9254t\" (UID: \"9f3e75c0-4839-425b-951d-c38abf6a16b5\") " pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.244086 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.244054 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:15.376055 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.376018 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7675b699c9-9254t"] Apr 16 18:04:15.383825 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:15.383799 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3e75c0_4839_425b_951d_c38abf6a16b5.slice/crio-129e0753390dd11590f009dd939ba524bfd2ec70a32787245baaff46fcad7e2e WatchSource:0}: Error finding container 129e0753390dd11590f009dd939ba524bfd2ec70a32787245baaff46fcad7e2e: Status 404 returned error can't find the container with id 129e0753390dd11590f009dd939ba524bfd2ec70a32787245baaff46fcad7e2e Apr 16 18:04:15.891136 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.890939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9vhnl" event={"ID":"7e057c09-2a88-4a81-99a0-c209f07556a8","Type":"ContainerStarted","Data":"ced89815a6a6078bbf332e20ea0c849b06f00e6ebd6260a2429253bd9a95805d"} Apr 16 18:04:15.891811 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.891152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9vhnl" event={"ID":"7e057c09-2a88-4a81-99a0-c209f07556a8","Type":"ContainerStarted","Data":"8e82af65c0864757c9f2a26fea142c35aba9d4045f6d7bb5e028dc115c7f3488"} Apr 16 18:04:15.892246 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.892221 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"129e0753390dd11590f009dd939ba524bfd2ec70a32787245baaff46fcad7e2e"} Apr 16 18:04:15.893674 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.893644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552"} Apr 16 18:04:15.913959 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:15.913917 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9vhnl" podStartSLOduration=2.487670245 podStartE2EDuration="3.913903272s" podCreationTimestamp="2026-04-16 18:04:12 +0000 UTC" firstStartedPulling="2026-04-16 18:04:12.979308128 +0000 UTC m=+47.922921166" lastFinishedPulling="2026-04-16 18:04:14.405541144 +0000 UTC m=+49.349154193" observedRunningTime="2026-04-16 18:04:15.909619465 +0000 UTC m=+50.853232508" watchObservedRunningTime="2026-04-16 18:04:15.913903272 +0000 UTC m=+50.857516328" Apr 16 18:04:16.330381 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.330339 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-78dcd49d79-g868h"] Apr 16 18:04:16.333048 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.333021 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.335127 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.334871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:04:16.335127 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.334990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:04:16.335127 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.334992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-pkb153huv1g1\"" Apr 16 18:04:16.335127 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.335020 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:04:16.335127 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.335099 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:04:16.335420 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.335406 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-fwqq9\"" Apr 16 18:04:16.343316 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.343294 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-78dcd49d79-g868h"] Apr 16 18:04:16.392342 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/df893896-7d60-4285-a325-30152ee3c5bd-metrics-server-audit-profiles\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " 
pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.392524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-client-ca-bundle\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.392524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-secret-metrics-server-tls\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.392524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df893896-7d60-4285-a325-30152ee3c5bd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.392524 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slk4p\" (UniqueName: \"kubernetes.io/projected/df893896-7d60-4285-a325-30152ee3c5bd-kube-api-access-slk4p\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.392734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392555 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/df893896-7d60-4285-a325-30152ee3c5bd-audit-log\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.392734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.392623 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-secret-metrics-server-client-certs\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493254 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493219 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/df893896-7d60-4285-a325-30152ee3c5bd-audit-log\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-secret-metrics-server-client-certs\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/df893896-7d60-4285-a325-30152ee3c5bd-metrics-server-audit-profiles\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-client-ca-bundle\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493422 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-secret-metrics-server-tls\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df893896-7d60-4285-a325-30152ee3c5bd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493702 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493628 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slk4p\" (UniqueName: \"kubernetes.io/projected/df893896-7d60-4285-a325-30152ee3c5bd-kube-api-access-slk4p\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " 
pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.493755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.493693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/df893896-7d60-4285-a325-30152ee3c5bd-audit-log\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.494376 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.494355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/df893896-7d60-4285-a325-30152ee3c5bd-metrics-server-audit-profiles\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.494834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.494813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df893896-7d60-4285-a325-30152ee3c5bd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.496902 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.496854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-secret-metrics-server-tls\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.497023 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.496961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" 
(UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-secret-metrics-server-client-certs\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.497108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.497031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df893896-7d60-4285-a325-30152ee3c5bd-client-ca-bundle\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.502946 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.502927 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slk4p\" (UniqueName: \"kubernetes.io/projected/df893896-7d60-4285-a325-30152ee3c5bd-kube-api-access-slk4p\") pod \"metrics-server-78dcd49d79-g868h\" (UID: \"df893896-7d60-4285-a325-30152ee3c5bd\") " pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.643469 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.643381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" Apr 16 18:04:16.685351 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.685313 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78598f7d98-jvvb4"] Apr 16 18:04:16.689244 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.689217 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6"] Apr 16 18:04:16.693095 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.693074 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:16.696168 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.696148 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:04:16.696314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.696298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-d24b2\"" Apr 16 18:04:16.707043 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.707023 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6"] Apr 16 18:04:16.715882 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.715852 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b49d59ddb-ddqd9"] Apr 16 18:04:16.719254 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.719234 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.732273 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.732233 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b49d59ddb-ddqd9"] Apr 16 18:04:16.795958 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.795923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6nt\" (UniqueName: \"kubernetes.io/projected/738fb93b-b947-4be6-8ec1-29fdc78da521-kube-api-access-ct6nt\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.796161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.795992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-trusted-ca-bundle\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.796161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.796021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-oauth-serving-cert\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.796161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.796042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-service-ca\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 
18:04:16.796161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.796069 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-52qb6\" (UID: \"008efc52-3ff5-42b0-985a-90b8699c1cda\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:16.796161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.796117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-serving-cert\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.796446 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.796168 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-console-config\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.796446 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.796207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-oauth-config\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6nt\" (UniqueName: 
\"kubernetes.io/projected/738fb93b-b947-4be6-8ec1-29fdc78da521-kube-api-access-ct6nt\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-trusted-ca-bundle\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-oauth-serving-cert\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-service-ca\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-52qb6\" (UID: \"008efc52-3ff5-42b0-985a-90b8699c1cda\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897658 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-serving-cert\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897682 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-console-config\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.897716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-oauth-config\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.898916 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.898791 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-service-ca\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.899436 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.899175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-trusted-ca-bundle\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.899436 ip-10-0-139-88 
kubenswrapper[2578]: E0416 18:04:16.899301 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 18:04:16.899436 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:16.899367 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert podName:008efc52-3ff5-42b0-985a-90b8699c1cda nodeName:}" failed. No retries permitted until 2026-04-16 18:04:17.399350309 +0000 UTC m=+52.342963343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-52qb6" (UID: "008efc52-3ff5-42b0-985a-90b8699c1cda") : secret "monitoring-plugin-cert" not found Apr 16 18:04:16.899436 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.899364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-oauth-serving-cert\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.899799 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.899779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-console-config\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.901631 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.901598 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552" exitCode=0 Apr 16 18:04:16.901742 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:04:16.901693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-serving-cert\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.901867 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.901846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552"} Apr 16 18:04:16.901965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.901944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-oauth-config\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:16.906716 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:16.906692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6nt\" (UniqueName: \"kubernetes.io/projected/738fb93b-b947-4be6-8ec1-29fdc78da521-kube-api-access-ct6nt\") pod \"console-6b49d59ddb-ddqd9\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:17.031257 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.031220 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:17.304135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.304108 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-78dcd49d79-g868h"] Apr 16 18:04:17.313486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.313457 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b49d59ddb-ddqd9"] Apr 16 18:04:17.317344 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:17.317293 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod738fb93b_b947_4be6_8ec1_29fdc78da521.slice/crio-58f19e437430dbbd3413a080f4970d8a1eb36b550dbba1ca6c394753d90cee80 WatchSource:0}: Error finding container 58f19e437430dbbd3413a080f4970d8a1eb36b550dbba1ca6c394753d90cee80: Status 404 returned error can't find the container with id 58f19e437430dbbd3413a080f4970d8a1eb36b550dbba1ca6c394753d90cee80 Apr 16 18:04:17.317573 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:17.317545 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf893896_7d60_4285_a325_30152ee3c5bd.slice/crio-b6704eeebce61662b5f2d1bb540417877138103a770ed2336d5b01a864133460 WatchSource:0}: Error finding container b6704eeebce61662b5f2d1bb540417877138103a770ed2336d5b01a864133460: Status 404 returned error can't find the container with id b6704eeebce61662b5f2d1bb540417877138103a770ed2336d5b01a864133460 Apr 16 18:04:17.404579 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.404310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-52qb6\" (UID: \"008efc52-3ff5-42b0-985a-90b8699c1cda\") " 
pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:17.404685 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:17.404452 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 18:04:17.404763 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:17.404736 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert podName:008efc52-3ff5-42b0-985a-90b8699c1cda nodeName:}" failed. No retries permitted until 2026-04-16 18:04:18.40471153 +0000 UTC m=+53.348324576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-52qb6" (UID: "008efc52-3ff5-42b0-985a-90b8699c1cda") : secret "monitoring-plugin-cert" not found Apr 16 18:04:17.906985 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.906950 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"b24276538d5be17a54e3d8940f043af1707b990510a8369b667797c7a7037428"} Apr 16 18:04:17.906985 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.906982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"cf75a9e4c0e8cbbce79809ad6d71e1666e21d4e6d9abbc61aa71980a4d26deef"} Apr 16 18:04:17.906985 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.906991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" 
event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"bede28464ee116d7da05ec5001269074bc42993ea4ab8cb5dc1a193f66cfce2b"} Apr 16 18:04:17.907991 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.907971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" event={"ID":"df893896-7d60-4285-a325-30152ee3c5bd","Type":"ContainerStarted","Data":"b6704eeebce61662b5f2d1bb540417877138103a770ed2336d5b01a864133460"} Apr 16 18:04:17.909076 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.909056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b49d59ddb-ddqd9" event={"ID":"738fb93b-b947-4be6-8ec1-29fdc78da521","Type":"ContainerStarted","Data":"52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd"} Apr 16 18:04:17.909156 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.909080 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b49d59ddb-ddqd9" event={"ID":"738fb93b-b947-4be6-8ec1-29fdc78da521","Type":"ContainerStarted","Data":"58f19e437430dbbd3413a080f4970d8a1eb36b550dbba1ca6c394753d90cee80"} Apr 16 18:04:17.930710 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:17.930669 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b49d59ddb-ddqd9" podStartSLOduration=1.930657422 podStartE2EDuration="1.930657422s" podCreationTimestamp="2026-04-16 18:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:17.929113987 +0000 UTC m=+52.872727090" watchObservedRunningTime="2026-04-16 18:04:17.930657422 +0000 UTC m=+52.874270478" Apr 16 18:04:18.064910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.064880 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:04:18.088469 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:04:18.088432 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.089234 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.089206 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:04:18.090956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.090595 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:04:18.090956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.090595 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:04:18.090956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.090824 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8dbtl4qhbmqub\"" Apr 16 18:04:18.090956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.090881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:04:18.091263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.091035 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:04:18.091263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.091127 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:04:18.091263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.091214 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:04:18.091808 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.091713 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:04:18.091957 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.091939 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:04:18.092090 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.091967 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:04:18.092708 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.092281 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:04:18.092708 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.092309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-25p8q\"" Apr 16 18:04:18.092708 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.092547 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:04:18.093985 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.093968 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:04:18.110661 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.110791 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110680 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-config-out\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.110791 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110715 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.110791 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.110791 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110813 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-config\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.110992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.111044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v665\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-kube-api-access-8v665\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.111079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.111118 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.111200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.111396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.111243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-web-config\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.211945 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.211910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212128 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.211958 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212128 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.211987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212273 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212273 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212273 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-config\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212422 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212290 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212422 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v665\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-kube-api-access-8v665\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212422 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212801 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.212904 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.212878 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.213776 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.213587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.213776 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.213591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-web-config\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.213776 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.213650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.213776 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.213732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.214087 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.213781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-config-out\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.214087 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.213839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.214087 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.214007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.214264 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.214055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.214264 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.214171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.215897 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.215870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.217158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.216820 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.217158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.217072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-web-config\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.223325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.219554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.223325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.219926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-config\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.223325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.221109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.223325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.222325 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.223325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.222772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.223325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.222973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.224034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.223902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.224034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.223970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-config-out\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.224205 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:04:18.224096 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.226079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.226057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.226985 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.226957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.229012 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.228970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v665\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-kube-api-access-8v665\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.233968 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.233926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.401295 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.401260 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:18.417235 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.417171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-52qb6\" (UID: \"008efc52-3ff5-42b0-985a-90b8699c1cda\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:18.420206 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.420140 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/008efc52-3ff5-42b0-985a-90b8699c1cda-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-52qb6\" (UID: \"008efc52-3ff5-42b0-985a-90b8699c1cda\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:18.504915 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:18.504855 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:19.754494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.754423 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:04:19.756027 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.755987 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6"] Apr 16 18:04:19.759743 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:19.759719 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73a56681_2392_44d7_8a0c_d00a834d6160.slice/crio-cc2f9d39c7ba79b73d4abb18e993b447505c59ce4318015ba698f0873a43b3ba WatchSource:0}: Error finding container cc2f9d39c7ba79b73d4abb18e993b447505c59ce4318015ba698f0873a43b3ba: Status 404 returned error can't find the container with id cc2f9d39c7ba79b73d4abb18e993b447505c59ce4318015ba698f0873a43b3ba Apr 16 18:04:19.760878 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:19.760855 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod008efc52_3ff5_42b0_985a_90b8699c1cda.slice/crio-c2e6afa70d5f1b0f7088905eb761883decc772939e2bfe89675242bd07e93944 WatchSource:0}: Error finding container c2e6afa70d5f1b0f7088905eb761883decc772939e2bfe89675242bd07e93944: Status 404 returned error can't find the container with id c2e6afa70d5f1b0f7088905eb761883decc772939e2bfe89675242bd07e93944 Apr 16 18:04:19.864579 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.864553 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6dthq" Apr 16 18:04:19.916971 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.916890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"cc2f9d39c7ba79b73d4abb18e993b447505c59ce4318015ba698f0873a43b3ba"} Apr 16 18:04:19.917990 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.917957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" event={"ID":"008efc52-3ff5-42b0-985a-90b8699c1cda","Type":"ContainerStarted","Data":"c2e6afa70d5f1b0f7088905eb761883decc772939e2bfe89675242bd07e93944"} Apr 16 18:04:19.919772 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.919753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648"} Apr 16 18:04:19.921226 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.921182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" event={"ID":"df893896-7d60-4285-a325-30152ee3c5bd","Type":"ContainerStarted","Data":"6600f25a8c9b1402c1b011d3a889da98caa8a25a37d88a80191ef392af72cb91"} Apr 16 18:04:19.939750 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:19.939696 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h" podStartSLOduration=1.6765315589999998 podStartE2EDuration="3.93967773s" podCreationTimestamp="2026-04-16 18:04:16 +0000 UTC" firstStartedPulling="2026-04-16 18:04:17.319574106 +0000 UTC m=+52.263187140" lastFinishedPulling="2026-04-16 18:04:19.582720256 +0000 UTC m=+54.526333311" observedRunningTime="2026-04-16 18:04:19.93795503 +0000 UTC m=+54.881568129" watchObservedRunningTime="2026-04-16 18:04:19.93967773 +0000 UTC m=+54.883290787" Apr 16 18:04:20.186844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.186821 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-78598f7d98-jvvb4" Apr 16 18:04:20.927431 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.927396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"a8cc986b3842f2063bcec2df29408c79fbd2db2cb772dba60fffe4778e1bdcb1"} Apr 16 18:04:20.927431 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.927437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"71a57f7b1f7d74726647788b1897ef27c954a7ca035e879c76ad396ebd5b5572"} Apr 16 18:04:20.927910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.927451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" event={"ID":"9f3e75c0-4839-425b-951d-c38abf6a16b5","Type":"ContainerStarted","Data":"da86349fa87581760af67afa0b19a9248aa9d12a80d957df92b1f7172ccec253"} Apr 16 18:04:20.927910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.927644 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:20.929055 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.929029 2578 generic.go:358] "Generic (PLEG): container finished" podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" exitCode=0 Apr 16 18:04:20.929211 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.929116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} Apr 16 18:04:20.932706 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.932670 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f"} Apr 16 18:04:20.932706 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.932699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734"} Apr 16 18:04:20.932706 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.932709 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196"} Apr 16 18:04:20.932900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.932718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b"} Apr 16 18:04:20.932900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.932727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerStarted","Data":"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f"} Apr 16 18:04:20.954015 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.953967 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" podStartSLOduration=2.326001027 podStartE2EDuration="6.953953779s" podCreationTimestamp="2026-04-16 18:04:14 +0000 UTC" firstStartedPulling="2026-04-16 18:04:15.385977304 +0000 UTC m=+50.329590338" 
lastFinishedPulling="2026-04-16 18:04:20.013930043 +0000 UTC m=+54.957543090" observedRunningTime="2026-04-16 18:04:20.951823909 +0000 UTC m=+55.895436977" watchObservedRunningTime="2026-04-16 18:04:20.953953779 +0000 UTC m=+55.897566834" Apr 16 18:04:20.981596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:20.981554 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.953855083 podStartE2EDuration="8.981541629s" podCreationTimestamp="2026-04-16 18:04:12 +0000 UTC" firstStartedPulling="2026-04-16 18:04:14.554045035 +0000 UTC m=+49.497658070" lastFinishedPulling="2026-04-16 18:04:19.581731565 +0000 UTC m=+54.525344616" observedRunningTime="2026-04-16 18:04:20.980640051 +0000 UTC m=+55.924253130" watchObservedRunningTime="2026-04-16 18:04:20.981541629 +0000 UTC m=+55.925154685" Apr 16 18:04:21.937760 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:21.937722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" event={"ID":"008efc52-3ff5-42b0-985a-90b8699c1cda","Type":"ContainerStarted","Data":"2ae8780ae426e56e51307cb4b6d3603955880ad0b4ecb3809f1dfb6b91db4f92"} Apr 16 18:04:21.938410 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:21.938356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:21.945561 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:21.945539 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" Apr 16 18:04:21.957334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:21.957281 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-52qb6" podStartSLOduration=3.94706295 podStartE2EDuration="5.957267863s" podCreationTimestamp="2026-04-16 18:04:16 +0000 UTC" 
firstStartedPulling="2026-04-16 18:04:19.763386583 +0000 UTC m=+54.706999631" lastFinishedPulling="2026-04-16 18:04:21.773591501 +0000 UTC m=+56.717204544" observedRunningTime="2026-04-16 18:04:21.955720928 +0000 UTC m=+56.899333983" watchObservedRunningTime="2026-04-16 18:04:21.957267863 +0000 UTC m=+56.900880918" Apr 16 18:04:24.833651 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:24.833523 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2pmhr" Apr 16 18:04:24.956227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:24.956144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} Apr 16 18:04:24.956227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:24.956177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} Apr 16 18:04:24.956227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:24.956214 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} Apr 16 18:04:24.956227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:24.956227 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} Apr 16 18:04:24.956445 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:24.956239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} Apr 16 18:04:25.962491 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:25.962462 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerStarted","Data":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} Apr 16 18:04:26.004883 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:26.004810 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.500753748 podStartE2EDuration="8.004795225s" podCreationTimestamp="2026-04-16 18:04:18 +0000 UTC" firstStartedPulling="2026-04-16 18:04:20.930564742 +0000 UTC m=+55.874177796" lastFinishedPulling="2026-04-16 18:04:24.434606236 +0000 UTC m=+59.378219273" observedRunningTime="2026-04-16 18:04:26.003237235 +0000 UTC m=+60.946850309" watchObservedRunningTime="2026-04-16 18:04:26.004795225 +0000 UTC m=+60.948408277" Apr 16 18:04:26.944748 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:26.944718 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7675b699c9-9254t" Apr 16 18:04:27.031926 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:27.031896 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:27.032286 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:27.031935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:27.036480 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:27.036458 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 
16 18:04:27.971890 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:27.971850 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:04:28.402102 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:28.402008 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:04:31.347869 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.347821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:04:31.350613 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.350590 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:04:31.360833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.360804 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0-metrics-certs\") pod \"network-metrics-daemon-t49dd\" (UID: \"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0\") " pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:04:31.448894 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.448851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:04:31.451408 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.451392 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:04:31.461234 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.461214 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:04:31.471757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.471726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9rpn\" (UniqueName: \"kubernetes.io/projected/4af28085-16ca-4a81-b155-9c85f1f05a68-kube-api-access-r9rpn\") pod \"network-check-target-9qqgk\" (UID: \"4af28085-16ca-4a81-b155-9c85f1f05a68\") " pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:04:31.662290 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.662212 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qtxn\"" Apr 16 18:04:31.670106 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.670084 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t49dd" Apr 16 18:04:31.670357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.670341 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z2hb6\"" Apr 16 18:04:31.679144 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.679111 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:04:31.802967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.802929 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t49dd"] Apr 16 18:04:31.805774 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:31.805747 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87aa22c0_e9f4_4d96_b5ce_9ac0e4521ab0.slice/crio-5e5a03dfb1aca5ef745b62ab516733b4f7e7f68dc0fd4055a402e21fea6ea1ec WatchSource:0}: Error finding container 5e5a03dfb1aca5ef745b62ab516733b4f7e7f68dc0fd4055a402e21fea6ea1ec: Status 404 returned error can't find the container with id 5e5a03dfb1aca5ef745b62ab516733b4f7e7f68dc0fd4055a402e21fea6ea1ec Apr 16 18:04:31.823776 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.823746 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9qqgk"] Apr 16 18:04:31.826417 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:04:31.826393 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af28085_16ca_4a81_b155_9c85f1f05a68.slice/crio-aedd83143241449d0104f3765763a01fb2a98be766d9d879795d93777dea7d11 WatchSource:0}: Error finding container aedd83143241449d0104f3765763a01fb2a98be766d9d879795d93777dea7d11: Status 404 returned error can't find the container with id aedd83143241449d0104f3765763a01fb2a98be766d9d879795d93777dea7d11 Apr 16 18:04:31.982126 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:31.982091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t49dd" event={"ID":"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0","Type":"ContainerStarted","Data":"5e5a03dfb1aca5ef745b62ab516733b4f7e7f68dc0fd4055a402e21fea6ea1ec"} Apr 16 18:04:31.983058 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:04:31.983037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9qqgk" event={"ID":"4af28085-16ca-4a81-b155-9c85f1f05a68","Type":"ContainerStarted","Data":"aedd83143241449d0104f3765763a01fb2a98be766d9d879795d93777dea7d11"}
Apr 16 18:04:33.991695 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:33.991663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t49dd" event={"ID":"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0","Type":"ContainerStarted","Data":"1aad9fe99d5bef9fbc8f68cce53761ac1afae9e54478f9bc0c2bb13b26379b4f"}
Apr 16 18:04:33.992111 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:33.991700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t49dd" event={"ID":"87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0","Type":"ContainerStarted","Data":"0c9fec31bc5d6b198aa3680996604348db5a55fe7164038d689a14aec0ecf06c"}
Apr 16 18:04:34.011125 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:34.011069 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-t49dd" podStartSLOduration=67.467448123 podStartE2EDuration="1m9.011049914s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:04:31.808185981 +0000 UTC m=+66.751799016" lastFinishedPulling="2026-04-16 18:04:33.351787772 +0000 UTC m=+68.295400807" observedRunningTime="2026-04-16 18:04:34.009281162 +0000 UTC m=+68.952894230" watchObservedRunningTime="2026-04-16 18:04:34.011049914 +0000 UTC m=+68.954662972"
Apr 16 18:04:34.995843 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:34.995812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9qqgk" event={"ID":"4af28085-16ca-4a81-b155-9c85f1f05a68","Type":"ContainerStarted","Data":"fe3d4c0f4e6a12f358b2ab98d47829870f55098bbad7df78ddbfc49bbc155715"}
Apr 16 18:04:35.018420 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:35.018376 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9qqgk" podStartSLOduration=67.194465011 podStartE2EDuration="1m10.018363161s" podCreationTimestamp="2026-04-16 18:03:25 +0000 UTC" firstStartedPulling="2026-04-16 18:04:31.828235419 +0000 UTC m=+66.771848456" lastFinishedPulling="2026-04-16 18:04:34.652133568 +0000 UTC m=+69.595746606" observedRunningTime="2026-04-16 18:04:35.016087429 +0000 UTC m=+69.959700488" watchObservedRunningTime="2026-04-16 18:04:35.018363161 +0000 UTC m=+69.961976217"
Apr 16 18:04:35.999258 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:35.999225 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9qqgk"
Apr 16 18:04:36.643762 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:36.643729 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h"
Apr 16 18:04:36.643762 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:36.643768 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h"
Apr 16 18:04:41.708942 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:41.708876 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78598f7d98-jvvb4" podUID="04a3b62b-74ad-4f16-92c3-0d072a1996c1" containerName="console" containerID="cri-o://c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30" gracePeriod=15
Apr 16 18:04:41.946150 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:41.946130 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78598f7d98-jvvb4_04a3b62b-74ad-4f16-92c3-0d072a1996c1/console/0.log"
Apr 16 18:04:41.946291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:41.946202 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78598f7d98-jvvb4"
Apr 16 18:04:42.018144 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.018067 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78598f7d98-jvvb4_04a3b62b-74ad-4f16-92c3-0d072a1996c1/console/0.log"
Apr 16 18:04:42.018144 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.018108 2578 generic.go:358] "Generic (PLEG): container finished" podID="04a3b62b-74ad-4f16-92c3-0d072a1996c1" containerID="c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30" exitCode=2
Apr 16 18:04:42.018368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.018141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78598f7d98-jvvb4" event={"ID":"04a3b62b-74ad-4f16-92c3-0d072a1996c1","Type":"ContainerDied","Data":"c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30"}
Apr 16 18:04:42.018368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.018172 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78598f7d98-jvvb4"
Apr 16 18:04:42.018368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.018202 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78598f7d98-jvvb4" event={"ID":"04a3b62b-74ad-4f16-92c3-0d072a1996c1","Type":"ContainerDied","Data":"3bd166751a9cd74b52802447f1a8143184c3174087c316a69c581627ea362700"}
Apr 16 18:04:42.018368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.018224 2578 scope.go:117] "RemoveContainer" containerID="c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30"
Apr 16 18:04:42.025881 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.025855 2578 scope.go:117] "RemoveContainer" containerID="c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30"
Apr 16 18:04:42.026138 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:04:42.026121 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30\": container with ID starting with c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30 not found: ID does not exist" containerID="c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30"
Apr 16 18:04:42.026199 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.026147 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30"} err="failed to get container status \"c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30\": rpc error: code = NotFound desc = could not find container \"c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30\": container with ID starting with c8620902c0eefbd81b48b16048d236e72a6bd050ac4da4ca6026298c11386e30 not found: ID does not exist"
Apr 16 18:04:42.043446 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043426 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-serving-cert\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043460 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-oauth-config\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043482 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-trusted-ca-bundle\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043502 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-oauth-serving-cert\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043698 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043630 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-service-ca\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043724 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zvs2\" (UniqueName: \"kubernetes.io/projected/04a3b62b-74ad-4f16-92c3-0d072a1996c1-kube-api-access-7zvs2\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043808 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043750 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-config\") pod \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\" (UID: \"04a3b62b-74ad-4f16-92c3-0d072a1996c1\") "
Apr 16 18:04:42.043937 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043911 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:04:42.043996 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043920 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:04:42.043996 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.043961 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-service-ca" (OuterVolumeSpecName: "service-ca") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:04:42.044106 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.044041 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-trusted-ca-bundle\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.044158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.044055 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-oauth-serving-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.044158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.044127 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-service-ca\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.044325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.044302 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-config" (OuterVolumeSpecName: "console-config") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:04:42.045721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.045694 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a3b62b-74ad-4f16-92c3-0d072a1996c1-kube-api-access-7zvs2" (OuterVolumeSpecName: "kube-api-access-7zvs2") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "kube-api-access-7zvs2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:04:42.045821 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.045721 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:04:42.045821 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.045701 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "04a3b62b-74ad-4f16-92c3-0d072a1996c1" (UID: "04a3b62b-74ad-4f16-92c3-0d072a1996c1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:04:42.144824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.144776 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-serving-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.144824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.144817 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-oauth-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.144824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.144828 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zvs2\" (UniqueName: \"kubernetes.io/projected/04a3b62b-74ad-4f16-92c3-0d072a1996c1-kube-api-access-7zvs2\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.144824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.144837 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04a3b62b-74ad-4f16-92c3-0d072a1996c1-console-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:04:42.339873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.339825 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78598f7d98-jvvb4"]
Apr 16 18:04:42.343417 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:42.343391 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78598f7d98-jvvb4"]
Apr 16 18:04:43.648518 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:43.648486 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a3b62b-74ad-4f16-92c3-0d072a1996c1" path="/var/lib/kubelet/pods/04a3b62b-74ad-4f16-92c3-0d072a1996c1/volumes"
Apr 16 18:04:54.810658 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:54.810622 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/init-config-reloader/0.log"
Apr 16 18:04:55.012245 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:55.012216 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/alertmanager/0.log"
Apr 16 18:04:55.210523 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:55.210501 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/config-reloader/0.log"
Apr 16 18:04:55.410881 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:55.410856 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/kube-rbac-proxy-web/0.log"
Apr 16 18:04:55.611432 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:55.611369 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/kube-rbac-proxy/0.log"
Apr 16 18:04:55.812984 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:55.812955 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/kube-rbac-proxy-metric/0.log"
Apr 16 18:04:56.014231 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.014184 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_56e89b7d-804f-4859-824d-cca58032953e/prom-label-proxy/0.log"
Apr 16 18:04:56.415715 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.415649 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-tmbf4_ece5b02a-1c86-402d-a07a-3645d98afe73/kube-state-metrics/0.log"
Apr 16 18:04:56.420248 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.420226 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:04:56.440379 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.440354 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:04:56.611540 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.611513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-tmbf4_ece5b02a-1c86-402d-a07a-3645d98afe73/kube-rbac-proxy-main/0.log"
Apr 16 18:04:56.649424 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.649399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h"
Apr 16 18:04:56.653153 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.653135 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-78dcd49d79-g868h"
Apr 16 18:04:56.810975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:56.810935 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-tmbf4_ece5b02a-1c86-402d-a07a-3645d98afe73/kube-rbac-proxy-self/0.log"
Apr 16 18:04:57.010301 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:57.010276 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-78dcd49d79-g868h_df893896-7d60-4285-a325-30152ee3c5bd/metrics-server/0.log"
Apr 16 18:04:57.076551 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:57.076469 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:04:57.210602 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:57.210576 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-52qb6_008efc52-3ff5-42b0-985a-90b8699c1cda/monitoring-plugin/0.log"
Apr 16 18:04:57.414475 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:57.414408 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9vhnl_7e057c09-2a88-4a81-99a0-c209f07556a8/init-textfile/0.log"
Apr 16 18:04:57.611297 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:57.611272 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9vhnl_7e057c09-2a88-4a81-99a0-c209f07556a8/node-exporter/0.log"
Apr 16 18:04:57.810771 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:57.810744 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9vhnl_7e057c09-2a88-4a81-99a0-c209f07556a8/kube-rbac-proxy/0.log"
Apr 16 18:04:59.209669 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:59.209641 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-vz96w_9bae2b33-3e4a-4468-84d7-208d6ae92a1a/kube-rbac-proxy-main/0.log"
Apr 16 18:04:59.410992 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:59.410968 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-vz96w_9bae2b33-3e4a-4468-84d7-208d6ae92a1a/kube-rbac-proxy-self/0.log"
Apr 16 18:04:59.610246 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:59.610145 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-vz96w_9bae2b33-3e4a-4468-84d7-208d6ae92a1a/openshift-state-metrics/0.log"
Apr 16 18:04:59.811025 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:04:59.810998 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/init-config-reloader/0.log"
Apr 16 18:05:00.011510 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:00.011483 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/prometheus/0.log"
Apr 16 18:05:00.215935 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:00.215906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/config-reloader/0.log"
Apr 16 18:05:00.410914 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:00.410844 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/thanos-sidecar/0.log"
Apr 16 18:05:00.610923 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:00.610869 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/kube-rbac-proxy-web/0.log"
Apr 16 18:05:00.811427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:00.811399 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/kube-rbac-proxy/0.log"
Apr 16 18:05:01.012136 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:01.012109 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73a56681-2392-44d7-8a0c-d00a834d6160/kube-rbac-proxy-thanos/0.log"
Apr 16 18:05:01.811218 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:01.811181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/thanos-query/0.log"
Apr 16 18:05:02.010733 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.010706 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy-web/0.log"
Apr 16 18:05:02.142698 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.142602 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:05:02.143140 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.143108 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="alertmanager" containerID="cri-o://777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648" gracePeriod=120
Apr 16 18:05:02.143262 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.143217 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy" containerID="cri-o://b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196" gracePeriod=120
Apr 16 18:05:02.143262 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.143207 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-web" containerID="cri-o://812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b" gracePeriod=120
Apr 16 18:05:02.143368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.143285 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="config-reloader" containerID="cri-o://52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f" gracePeriod=120
Apr 16 18:05:02.143368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.143338 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="prom-label-proxy" containerID="cri-o://54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f" gracePeriod=120
Apr 16 18:05:02.143546 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.143513 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-metric" containerID="cri-o://fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734" gracePeriod=120
Apr 16 18:05:02.210629 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.210602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy/0.log"
Apr 16 18:05:02.411328 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.411249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/prom-label-proxy/0.log"
Apr 16 18:05:02.609889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.609860 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy-rules/0.log"
Apr 16 18:05:02.810006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:02.809985 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy-metrics/0.log"
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085863 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f" exitCode=0
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085886 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196" exitCode=0
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085893 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f" exitCode=0
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085899 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648" exitCode=0
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f"}
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196"}
Apr 16 18:05:03.085951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085952 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f"}
Apr 16 18:05:03.086506 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.085961 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648"}
Apr 16 18:05:03.385932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.385913 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:05:03.421448 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421421 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-config-out\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421460 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-tls-assets\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421484 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421517 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-config-volume\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421552 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-web-config\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421582 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-metrics-client-ca\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421629 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-trusted-ca-bundle\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421660 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-main-db\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421701 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-web\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421764 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421807 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrxfx\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-kube-api-access-qrxfx\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421839 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.421866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.421868 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-cluster-tls-config\") pod \"56e89b7d-804f-4859-824d-cca58032953e\" (UID: \"56e89b7d-804f-4859-824d-cca58032953e\") "
Apr 16 18:05:03.422796 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.422279 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:05:03.422796 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.422535 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:05:03.423064 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.423037 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:05:03.424412 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.424386 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-config-volume" (OuterVolumeSpecName: "config-volume") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:03.425277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.425124 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:03.425277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.425232 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:05:03.425522 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.425490 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:03.425833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.425800 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:03.426419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.426392 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:03.427099 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.427076 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-config-out" (OuterVolumeSpecName: "config-out") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:05:03.427387 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.427330 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-kube-api-access-qrxfx" (OuterVolumeSpecName: "kube-api-access-qrxfx") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "kube-api-access-qrxfx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:05:03.430858 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.430836 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:05:03.439345 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.439319 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-web-config" (OuterVolumeSpecName: "web-config") pod "56e89b7d-804f-4859-824d-cca58032953e" (UID: "56e89b7d-804f-4859-824d-cca58032953e"). InnerVolumeSpecName "web-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:03.523013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.522984 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523009 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-cluster-tls-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523021 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-config-out\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523030 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-tls-assets\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523039 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-main-tls\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523047 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-config-volume\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523056 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-web-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523065 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-metrics-client-ca\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523075 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523084 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/56e89b7d-804f-4859-824d-cca58032953e-alertmanager-main-db\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523092 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.523102 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/56e89b7d-804f-4859-824d-cca58032953e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.523242 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:05:03.523110 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qrxfx\" (UniqueName: \"kubernetes.io/projected/56e89b7d-804f-4859-824d-cca58032953e-kube-api-access-qrxfx\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:03.610833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:03.610807 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b49d59ddb-ddqd9_738fb93b-b947-4be6-8ec1-29fdc78da521/console/0.log" Apr 16 18:05:04.090817 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090782 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734" exitCode=0 Apr 16 18:05:04.090817 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090812 2578 generic.go:358] "Generic (PLEG): container finished" podID="56e89b7d-804f-4859-824d-cca58032953e" containerID="812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b" exitCode=0 Apr 16 18:05:04.091338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734"} Apr 16 18:05:04.091338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090913 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b"} Apr 16 18:05:04.091338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090924 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"56e89b7d-804f-4859-824d-cca58032953e","Type":"ContainerDied","Data":"ac0d704ff6f33ed47fa30c9b0d627ba93ba3428a3781e8436095df2abd284bc9"} Apr 16 18:05:04.091338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090939 2578 scope.go:117] "RemoveContainer" containerID="54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f" Apr 16 18:05:04.091338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.090884 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.099039 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.098951 2578 scope.go:117] "RemoveContainer" containerID="fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734" Apr 16 18:05:04.106843 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.106826 2578 scope.go:117] "RemoveContainer" containerID="b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196" Apr 16 18:05:04.113047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.113023 2578 scope.go:117] "RemoveContainer" containerID="812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b" Apr 16 18:05:04.116579 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.116542 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:04.120237 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.120216 2578 scope.go:117] "RemoveContainer" containerID="52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f" Apr 16 18:05:04.124954 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.124938 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:04.127627 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.127607 2578 scope.go:117] "RemoveContainer" containerID="777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648" Apr 16 18:05:04.133927 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.133914 2578 
scope.go:117] "RemoveContainer" containerID="4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552" Apr 16 18:05:04.140538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.140521 2578 scope.go:117] "RemoveContainer" containerID="54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f" Apr 16 18:05:04.140781 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.140764 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f\": container with ID starting with 54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f not found: ID does not exist" containerID="54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f" Apr 16 18:05:04.140828 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.140789 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f"} err="failed to get container status \"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f\": rpc error: code = NotFound desc = could not find container \"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f\": container with ID starting with 54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f not found: ID does not exist" Apr 16 18:05:04.140828 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.140807 2578 scope.go:117] "RemoveContainer" containerID="fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734" Apr 16 18:05:04.141037 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.141018 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734\": container with ID starting with fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734 not found: ID does not 
exist" containerID="fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734" Apr 16 18:05:04.141099 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141048 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734"} err="failed to get container status \"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734\": rpc error: code = NotFound desc = could not find container \"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734\": container with ID starting with fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734 not found: ID does not exist" Apr 16 18:05:04.141099 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141073 2578 scope.go:117] "RemoveContainer" containerID="b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196" Apr 16 18:05:04.141321 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.141305 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196\": container with ID starting with b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196 not found: ID does not exist" containerID="b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196" Apr 16 18:05:04.141374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141328 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196"} err="failed to get container status \"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196\": rpc error: code = NotFound desc = could not find container \"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196\": container with ID starting with b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196 not found: ID does not exist" Apr 16 
18:05:04.141374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141343 2578 scope.go:117] "RemoveContainer" containerID="812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b" Apr 16 18:05:04.141582 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.141565 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b\": container with ID starting with 812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b not found: ID does not exist" containerID="812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b" Apr 16 18:05:04.141624 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141586 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b"} err="failed to get container status \"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b\": rpc error: code = NotFound desc = could not find container \"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b\": container with ID starting with 812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b not found: ID does not exist" Apr 16 18:05:04.141624 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141601 2578 scope.go:117] "RemoveContainer" containerID="52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f" Apr 16 18:05:04.141809 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.141792 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f\": container with ID starting with 52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f not found: ID does not exist" containerID="52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f" Apr 16 18:05:04.141857 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141813 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f"} err="failed to get container status \"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f\": rpc error: code = NotFound desc = could not find container \"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f\": container with ID starting with 52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f not found: ID does not exist" Apr 16 18:05:04.141857 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.141828 2578 scope.go:117] "RemoveContainer" containerID="777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648" Apr 16 18:05:04.142057 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.142040 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648\": container with ID starting with 777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648 not found: ID does not exist" containerID="777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648" Apr 16 18:05:04.142115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142064 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648"} err="failed to get container status \"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648\": rpc error: code = NotFound desc = could not find container \"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648\": container with ID starting with 777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648 not found: ID does not exist" Apr 16 18:05:04.142115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142083 2578 scope.go:117] "RemoveContainer" 
containerID="4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552" Apr 16 18:05:04.142325 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:04.142309 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552\": container with ID starting with 4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552 not found: ID does not exist" containerID="4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552" Apr 16 18:05:04.142374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142328 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552"} err="failed to get container status \"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552\": rpc error: code = NotFound desc = could not find container \"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552\": container with ID starting with 4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552 not found: ID does not exist" Apr 16 18:05:04.142374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142341 2578 scope.go:117] "RemoveContainer" containerID="54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f" Apr 16 18:05:04.142538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142521 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f"} err="failed to get container status \"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f\": rpc error: code = NotFound desc = could not find container \"54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f\": container with ID starting with 54995acef9141eb99f9b343b85889016c9ad7f819af9756539c28592b225309f not found: ID does not exist" Apr 16 
18:05:04.142684 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142539 2578 scope.go:117] "RemoveContainer" containerID="fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734" Apr 16 18:05:04.142766 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142734 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734"} err="failed to get container status \"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734\": rpc error: code = NotFound desc = could not find container \"fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734\": container with ID starting with fa504ddea4a8b735c4da2cabf7427db4ae2cdec508ccc34673bc96e5e26aa734 not found: ID does not exist" Apr 16 18:05:04.142766 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.142756 2578 scope.go:117] "RemoveContainer" containerID="b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196" Apr 16 18:05:04.143104 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143078 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196"} err="failed to get container status \"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196\": rpc error: code = NotFound desc = could not find container \"b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196\": container with ID starting with b1b7e5e94f3c5e9618acb2bb09af7150f760cca26ddfd625d297432c96e36196 not found: ID does not exist" Apr 16 18:05:04.143222 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143105 2578 scope.go:117] "RemoveContainer" containerID="812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b" Apr 16 18:05:04.143579 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143557 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b"} err="failed to get container status \"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b\": rpc error: code = NotFound desc = could not find container \"812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b\": container with ID starting with 812959a91bcec9fecd21d675225d5bee49dc9c36d670a44ed43d08c501b6dd7b not found: ID does not exist" Apr 16 18:05:04.143632 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143580 2578 scope.go:117] "RemoveContainer" containerID="52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f" Apr 16 18:05:04.143829 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143811 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f"} err="failed to get container status \"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f\": rpc error: code = NotFound desc = could not find container \"52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f\": container with ID starting with 52de1813ba6e4e1a952879b7047c144de35c3c1301545d280402bac4af45784f not found: ID does not exist" Apr 16 18:05:04.143872 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143830 2578 scope.go:117] "RemoveContainer" containerID="777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648" Apr 16 18:05:04.144015 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.143999 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648"} err="failed to get container status \"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648\": rpc error: code = NotFound desc = could not find container \"777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648\": container with ID starting with 
777276d1cfb0002930432d03d5b3288d217b53114b916fb08a0c05e452f72648 not found: ID does not exist" Apr 16 18:05:04.144056 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.144015 2578 scope.go:117] "RemoveContainer" containerID="4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552" Apr 16 18:05:04.144219 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.144180 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552"} err="failed to get container status \"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552\": rpc error: code = NotFound desc = could not find container \"4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552\": container with ID starting with 4bf1dbee739168cf810bcde8c5a187e529dc27491efe0e848ad90ea63ce3f552 not found: ID does not exist" Apr 16 18:05:04.152410 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152391 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:04.152688 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152676 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy" Apr 16 18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152690 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy" Apr 16 18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152700 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="prom-label-proxy" Apr 16 18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152706 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="prom-label-proxy" Apr 16 
18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152714 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-web" Apr 16 18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152720 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-web" Apr 16 18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152732 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="config-reloader" Apr 16 18:05:04.152735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152737 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="config-reloader" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152743 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a3b62b-74ad-4f16-92c3-0d072a1996c1" containerName="console" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152748 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a3b62b-74ad-4f16-92c3-0d072a1996c1" containerName="console" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152758 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="init-config-reloader" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152763 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="init-config-reloader" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152768 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" 
containerName="alertmanager" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152772 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="alertmanager" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152779 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-metric" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152784 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-metric" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152828 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="alertmanager" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152834 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-web" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152841 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152846 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="04a3b62b-74ad-4f16-92c3-0d072a1996c1" containerName="console" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152854 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="kube-rbac-proxy-metric" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152861 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e89b7d-804f-4859-824d-cca58032953e" 
containerName="config-reloader" Apr 16 18:05:04.152949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.152868 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="56e89b7d-804f-4859-824d-cca58032953e" containerName="prom-label-proxy" Apr 16 18:05:04.156252 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.156237 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.158151 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158135 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158488 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158496 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158532 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158557 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158571 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 
18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158498 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-c4jn7\"" Apr 16 18:05:04.158645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.158624 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:05:04.164744 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.164725 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:05:04.169851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.169835 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:04.228923 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.228899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229045 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.228931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d4f\" (UniqueName: \"kubernetes.io/projected/17e65dd5-ef7e-42a0-a302-3785c49c48ab-kube-api-access-s2d4f\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229045 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.228950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/17e65dd5-ef7e-42a0-a302-3785c49c48ab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229045 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.228976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229045 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17e65dd5-ef7e-42a0-a302-3785c49c48ab-config-out\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229168 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229060 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/17e65dd5-ef7e-42a0-a302-3785c49c48ab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229168 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229080 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17e65dd5-ef7e-42a0-a302-3785c49c48ab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229168 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229097 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-web-config\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-config-volume\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229365 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.229365 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.229316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17e65dd5-ef7e-42a0-a302-3785c49c48ab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330211 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-web-config\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330211 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-config-volume\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330274 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17e65dd5-ef7e-42a0-a302-3785c49c48ab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d4f\" (UniqueName: \"kubernetes.io/projected/17e65dd5-ef7e-42a0-a302-3785c49c48ab-kube-api-access-s2d4f\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/17e65dd5-ef7e-42a0-a302-3785c49c48ab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
18:05:04.330826 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330902 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17e65dd5-ef7e-42a0-a302-3785c49c48ab-config-out\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.330902 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/17e65dd5-ef7e-42a0-a302-3785c49c48ab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.331002 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17e65dd5-ef7e-42a0-a302-3785c49c48ab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.331002 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.330938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.331132 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.331111 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17e65dd5-ef7e-42a0-a302-3785c49c48ab-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-web-config\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-config-volume\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333469 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333814 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/17e65dd5-ef7e-42a0-a302-3785c49c48ab-config-out\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333814 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/17e65dd5-ef7e-42a0-a302-3785c49c48ab-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333814 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.333961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.333946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/17e65dd5-ef7e-42a0-a302-3785c49c48ab-tls-assets\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.334113 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.334092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/17e65dd5-ef7e-42a0-a302-3785c49c48ab-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.335298 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.335279 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/17e65dd5-ef7e-42a0-a302-3785c49c48ab-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.338397 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.338379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d4f\" (UniqueName: \"kubernetes.io/projected/17e65dd5-ef7e-42a0-a302-3785c49c48ab-kube-api-access-s2d4f\") pod \"alertmanager-main-0\" (UID: \"17e65dd5-ef7e-42a0-a302-3785c49c48ab\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.410757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.410691 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-79ghb_c118da2f-c36c-4fca-859c-34ed40076370/serve-healthcheck-canary/0.log" Apr 16 18:05:04.466135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.466106 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:05:04.595252 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:04.595229 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:05:04.597745 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:05:04.597721 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e65dd5_ef7e_42a0_a302_3785c49c48ab.slice/crio-12f66876c40c0b820f1d65a2d14b123dd850f4814e193a826bfe89ad6c125e75 WatchSource:0}: Error finding container 12f66876c40c0b820f1d65a2d14b123dd850f4814e193a826bfe89ad6c125e75: Status 404 returned error can't find the container with id 12f66876c40c0b820f1d65a2d14b123dd850f4814e193a826bfe89ad6c125e75 Apr 16 18:05:05.095965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:05.095926 2578 generic.go:358] "Generic (PLEG): container finished" podID="17e65dd5-ef7e-42a0-a302-3785c49c48ab" containerID="15a268122b17ae98a684f1caf7df545020df16397a304b88f2809ebd08c98e0b" exitCode=0 Apr 16 18:05:05.096538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:05.095978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerDied","Data":"15a268122b17ae98a684f1caf7df545020df16397a304b88f2809ebd08c98e0b"} Apr 16 18:05:05.096538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:05.096007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"12f66876c40c0b820f1d65a2d14b123dd850f4814e193a826bfe89ad6c125e75"} Apr 16 18:05:05.649264 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:05.649162 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e89b7d-804f-4859-824d-cca58032953e" 
path="/var/lib/kubelet/pods/56e89b7d-804f-4859-824d-cca58032953e/volumes" Apr 16 18:05:06.103362 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.103329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"f600ced3487c45f97763e7f167b326f3a9c09fe83e1c6b7f5231c53e763f4e4c"} Apr 16 18:05:06.103709 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.103369 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"cd06341a8efa45cf031fc676350d0099e9621a3a147c10be625a4fa04eb6ebb3"} Apr 16 18:05:06.103709 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.103380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"c5f7eb87069265092ba5b6774306f570c2fd947c4e8db6cbc8834fd232603d1c"} Apr 16 18:05:06.103709 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.103389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"75d1450db377daacf42026dd08679afc1bc2dd8102213c9708bbe017db441704"} Apr 16 18:05:06.103709 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.103397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"5491337bab658406fcc6a642c81d7f39f62e7e37eaacdc5916d897748bec47ec"} Apr 16 18:05:06.103709 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.103406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"17e65dd5-ef7e-42a0-a302-3785c49c48ab","Type":"ContainerStarted","Data":"989351bed316af5e6bdbcb36e9295dc51a9199b683fa1a53c343f1bab58995ea"} Apr 16 18:05:06.141296 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.141234 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.141168135 podStartE2EDuration="2.141168135s" podCreationTimestamp="2026-04-16 18:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:05:06.138474278 +0000 UTC m=+101.082087374" watchObservedRunningTime="2026-04-16 18:05:06.141168135 +0000 UTC m=+101.084781191" Apr 16 18:05:06.223291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.223256 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-684bc67856-gnfsf"] Apr 16 18:05:06.226891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.226865 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.229658 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.229534 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 18:05:06.229658 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.229552 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-svsd2\"" Apr 16 18:05:06.229836 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.229693 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 18:05:06.229893 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.229880 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 18:05:06.229951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.229880 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 18:05:06.229999 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.229981 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 18:05:06.237166 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.237143 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 18:05:06.247264 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.244710 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-684bc67856-gnfsf"] Apr 16 18:05:06.351182 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351139 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-metrics-client-ca\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-telemeter-trusted-ca-bundle\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-federate-client-tls\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k64xr\" (UniqueName: 
\"kubernetes.io/projected/3e82dd6c-3235-423d-a5db-7a937f30e2ed-kube-api-access-k64xr\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-secret-telemeter-client\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351678 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-serving-certs-ca-bundle\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.351678 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.351456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-telemeter-client-tls\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.452778 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.452739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453032 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k64xr\" (UniqueName: \"kubernetes.io/projected/3e82dd6c-3235-423d-a5db-7a937f30e2ed-kube-api-access-k64xr\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453165 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-secret-telemeter-client\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453333 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-serving-certs-ca-bundle\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453457 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453441 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-telemeter-client-tls\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453602 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453588 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-metrics-client-ca\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-telemeter-trusted-ca-bundle\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.453852 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.453838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-federate-client-tls\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.454395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.454364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-serving-certs-ca-bundle\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.455356 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.455331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-metrics-client-ca\") pod \"telemeter-client-684bc67856-gnfsf\" 
(UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.456693 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.456667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e82dd6c-3235-423d-a5db-7a937f30e2ed-telemeter-trusted-ca-bundle\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.459645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.458911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-federate-client-tls\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.459645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.459556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-secret-telemeter-client\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.460303 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.460268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.461030 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.461007 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/3e82dd6c-3235-423d-a5db-7a937f30e2ed-telemeter-client-tls\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.462278 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.462257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k64xr\" (UniqueName: \"kubernetes.io/projected/3e82dd6c-3235-423d-a5db-7a937f30e2ed-kube-api-access-k64xr\") pod \"telemeter-client-684bc67856-gnfsf\" (UID: \"3e82dd6c-3235-423d-a5db-7a937f30e2ed\") " pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.538428 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.538399 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" Apr 16 18:05:06.585606 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.585414 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:06.586515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.585973 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="prometheus" containerID="cri-o://5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" gracePeriod=600 Apr 16 18:05:06.586515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.586152 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-thanos" containerID="cri-o://03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" gracePeriod=600 Apr 16 18:05:06.586515 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:05:06.586236 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="thanos-sidecar" containerID="cri-o://a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" gracePeriod=600 Apr 16 18:05:06.586515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.586270 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="config-reloader" containerID="cri-o://0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" gracePeriod=600 Apr 16 18:05:06.586515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.586308 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy" containerID="cri-o://ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" gracePeriod=600 Apr 16 18:05:06.586515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.586357 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-web" containerID="cri-o://6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" gracePeriod=600 Apr 16 18:05:06.700952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.700919 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-684bc67856-gnfsf"] Apr 16 18:05:06.705075 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:05:06.705046 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e82dd6c_3235_423d_a5db_7a937f30e2ed.slice/crio-8e5e1a1ed8df586bd2ba4634572412e7a409126aac450d1000397d9e7070c666 
WatchSource:0}: Error finding container 8e5e1a1ed8df586bd2ba4634572412e7a409126aac450d1000397d9e7070c666: Status 404 returned error can't find the container with id 8e5e1a1ed8df586bd2ba4634572412e7a409126aac450d1000397d9e7070c666 Apr 16 18:05:06.878077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.878047 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:06.960582 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960495 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-kubelet-serving-ca-bundle\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960582 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960547 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-config-out\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960799 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960653 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-serving-certs-ca-bundle\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960799 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960709 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v665\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-kube-api-access-8v665\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: 
\"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960799 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960736 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-tls-assets\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960799 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960789 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-metrics-client-certs\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960815 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-rulefiles-0\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960852 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960884 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-metrics-client-ca\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: 
\"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960909 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:06.960997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960918 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-trusted-ca-bundle\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.960997 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.960986 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-config\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961313 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961015 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961313 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961058 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-grpc-tls\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961313 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961293 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:06.961728 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961622 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-thanos-prometheus-http-client-file\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961728 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961664 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-db\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961895 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961729 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-tls\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961895 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961777 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-web-config\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.961895 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.961810 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-kube-rbac-proxy\") pod \"73a56681-2392-44d7-8a0c-d00a834d6160\" (UID: \"73a56681-2392-44d7-8a0c-d00a834d6160\") " Apr 16 18:05:06.962099 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.962083 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:06.962152 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.962102 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:06.962152 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.962100 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:06.962283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.962258 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:06.963619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.963591 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-config-out" (OuterVolumeSpecName: "config-out") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:05:06.963949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.963722 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:06.964638 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.964612 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:05:06.965275 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.965238 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.965803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.965770 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:06.965940 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.965903 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.966078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.965998 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.966143 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.966072 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.966143 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.966090 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-config" (OuterVolumeSpecName: "config") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.966280 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.966209 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.966393 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.966371 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.966818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.966794 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:06.967954 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.967930 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-kube-api-access-8v665" (OuterVolumeSpecName: "kube-api-access-8v665") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "kube-api-access-8v665". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:06.982283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:06.981615 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-web-config" (OuterVolumeSpecName: "web-config") pod "73a56681-2392-44d7-8a0c-d00a834d6160" (UID: "73a56681-2392-44d7-8a0c-d00a834d6160"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:07.006372 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.006346 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9qqgk" Apr 16 18:05:07.063671 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063636 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-grpc-tls\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063671 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063667 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063684 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-db\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063701 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063718 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-web-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063732 2578 reconciler_common.go:299] "Volume detached 
for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-kube-rbac-proxy\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063746 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73a56681-2392-44d7-8a0c-d00a834d6160-config-out\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063761 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063779 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8v665\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-kube-api-access-8v665\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063793 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73a56681-2392-44d7-8a0c-d00a834d6160-tls-assets\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063807 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-metrics-client-certs\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063821 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063836 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063851 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73a56681-2392-44d7-8a0c-d00a834d6160-configmap-metrics-client-ca\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063864 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.063905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.063880 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73a56681-2392-44d7-8a0c-d00a834d6160-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:07.109423 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.109385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" event={"ID":"3e82dd6c-3235-423d-a5db-7a937f30e2ed","Type":"ContainerStarted","Data":"8e5e1a1ed8df586bd2ba4634572412e7a409126aac450d1000397d9e7070c666"} Apr 16 18:05:07.113395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113354 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" exitCode=0 Apr 16 18:05:07.113395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113388 2578 generic.go:358] "Generic (PLEG): container finished" podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" exitCode=0 Apr 16 18:05:07.113395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113398 2578 generic.go:358] "Generic (PLEG): container finished" podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" exitCode=0 Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113408 2578 generic.go:358] "Generic (PLEG): container finished" podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" exitCode=0 Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113419 2578 generic.go:358] "Generic (PLEG): container finished" podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" exitCode=0 Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113429 2578 generic.go:358] "Generic (PLEG): container finished" podID="73a56681-2392-44d7-8a0c-d00a834d6160" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" exitCode=0 Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113440 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73a56681-2392-44d7-8a0c-d00a834d6160","Type":"ContainerDied","Data":"cc2f9d39c7ba79b73d4abb18e993b447505c59ce4318015ba698f0873a43b3ba"} Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113535 2578 scope.go:117] "RemoveContainer" 
containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.113639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.113519 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.123660 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.123632 2578 scope.go:117] "RemoveContainer" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.132260 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.132242 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.140295 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.140275 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.142240 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.141345 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:07.149249 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.149226 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:07.150330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.150249 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.164467 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.162182 2578 scope.go:117] "RemoveContainer" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.172306 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.172274 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180420 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180816 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-web" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180831 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-web" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180843 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="init-config-reloader" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180851 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="init-config-reloader" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180861 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="config-reloader" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180869 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="config-reloader" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180889 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-thanos" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180897 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-thanos" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180910 2578 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180918 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180932 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="prometheus" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180940 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="prometheus" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180947 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="thanos-sidecar" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.180956 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="thanos-sidecar" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.181016 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="thanos-sidecar" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.181027 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="config-reloader" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.181037 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-thanos" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.181049 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy-web" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.181060 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="kube-rbac-proxy" Apr 16 18:05:07.182721 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.181070 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" containerName="prometheus" Apr 16 18:05:07.184058 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.183975 2578 scope.go:117] "RemoveContainer" containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.184411 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.184335 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.184411 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.184373 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} err="failed to get container status \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" Apr 16 18:05:07.184411 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.184399 2578 scope.go:117] "RemoveContainer" 
containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.184774 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.184745 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.184851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.184789 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} err="failed to get container status \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" Apr 16 18:05:07.184851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.184817 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.185432 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.185335 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.185432 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.185366 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} err="failed to get container status \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" Apr 16 18:05:07.185432 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.185383 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.186434 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.186409 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not found: ID does not exist" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.186533 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.186439 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} err="failed to get container status \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not found: ID does not exist" Apr 16 18:05:07.186533 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.186459 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.186695 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:05:07.186574 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.187011 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.186817 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.187011 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.186858 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} err="failed to get container status \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": rpc error: code = NotFound desc = could not find container \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" Apr 16 18:05:07.187011 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.186876 2578 scope.go:117] "RemoveContainer" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.187409 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.187389 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.187461 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:05:07.187413 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} err="failed to get container status \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" Apr 16 18:05:07.187461 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.187428 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.187791 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:07.187770 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.187837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.187799 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} err="failed to get container status \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" Apr 16 18:05:07.187837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.187813 2578 scope.go:117] "RemoveContainer" 
containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.188079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188055 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} err="failed to get container status \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" Apr 16 18:05:07.188118 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188083 2578 scope.go:117] "RemoveContainer" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.188433 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188390 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} err="failed to get container status \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" Apr 16 18:05:07.188506 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188433 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.188828 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188810 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:05:07.188889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188839 
2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} err="failed to get container status \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" Apr 16 18:05:07.188889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188864 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.188981 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.188815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:05:07.189036 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-25p8q\"" Apr 16 18:05:07.189036 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189025 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:05:07.189150 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189126 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} err="failed to get container status \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not 
found: ID does not exist" Apr 16 18:05:07.189227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189153 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.189286 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189256 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:05:07.189339 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189305 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:05:07.189489 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189462 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} err="failed to get container status \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": rpc error: code = NotFound desc = could not find container \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" Apr 16 18:05:07.189540 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189493 2578 scope.go:117] "RemoveContainer" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.189944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189883 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} err="failed to get container status \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 
5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" Apr 16 18:05:07.189944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.189909 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.190333 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.190309 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} err="failed to get container status \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" Apr 16 18:05:07.190399 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.190337 2578 scope.go:117] "RemoveContainer" containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.190660 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.190632 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} err="failed to get container status \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" Apr 16 18:05:07.190745 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.190674 2578 scope.go:117] "RemoveContainer" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.191235 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.191215 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} err="failed to get container status \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" Apr 16 18:05:07.191327 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.191236 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.191432 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.191415 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8dbtl4qhbmqub\"" Apr 16 18:05:07.191629 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.191610 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:05:07.191798 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.191776 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:05:07.192166 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.192140 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} err="failed to get container status \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" Apr 16 
18:05:07.192166 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.192166 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.192332 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.192279 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:05:07.192906 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.192883 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} err="failed to get container status \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not found: ID does not exist" Apr 16 18:05:07.193019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.192910 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.193439 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.193134 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} err="failed to get container status \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": rpc error: code = NotFound desc = could not find container \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" Apr 16 18:05:07.193439 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.193166 2578 scope.go:117] "RemoveContainer" 
containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.193880 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.193789 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} err="failed to get container status \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" Apr 16 18:05:07.193880 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.193812 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.194424 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.194245 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:05:07.194619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.194448 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} err="failed to get container status \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" Apr 16 18:05:07.194619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.194473 2578 scope.go:117] "RemoveContainer" containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.195007 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.194882 2578 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:05:07.195106 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.195089 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:05:07.195853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.195330 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} err="failed to get container status \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" Apr 16 18:05:07.195853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.195360 2578 scope.go:117] "RemoveContainer" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.195853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.195671 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} err="failed to get container status \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" Apr 16 18:05:07.195853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.195698 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.196068 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.196039 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} err="failed to get container status \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" Apr 16 18:05:07.196068 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.196061 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.196392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.196363 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} err="failed to get container status \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not found: ID does not exist" Apr 16 18:05:07.196468 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.196394 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.196891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.196798 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} err="failed to get container status \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": rpc error: code = NotFound desc = could not find 
container \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" Apr 16 18:05:07.196891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.196825 2578 scope.go:117] "RemoveContainer" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.197124 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.197098 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} err="failed to get container status \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" Apr 16 18:05:07.197215 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.197126 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.197629 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.197541 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} err="failed to get container status \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" Apr 16 18:05:07.197629 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.197566 2578 scope.go:117] "RemoveContainer" 
containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.198305 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198280 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} err="failed to get container status \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" Apr 16 18:05:07.198395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198307 2578 scope.go:117] "RemoveContainer" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.198621 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198600 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} err="failed to get container status \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" Apr 16 18:05:07.198695 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198623 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.198749 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198693 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:05:07.198883 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198864 
2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} err="failed to get container status \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" Apr 16 18:05:07.198952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.198883 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.199102 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199078 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} err="failed to get container status \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not found: ID does not exist" Apr 16 18:05:07.199164 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199105 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.199366 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199339 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} err="failed to get container status \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": rpc error: code = NotFound desc = could not find container 
\"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" Apr 16 18:05:07.199456 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199367 2578 scope.go:117] "RemoveContainer" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.199662 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199637 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} err="failed to get container status \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" Apr 16 18:05:07.199662 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199663 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.199953 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199928 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} err="failed to get container status \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" Apr 16 18:05:07.200033 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.199954 2578 scope.go:117] "RemoveContainer" 
containerID="03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d" Apr 16 18:05:07.200329 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.200292 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d"} err="failed to get container status \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": rpc error: code = NotFound desc = could not find container \"03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d\": container with ID starting with 03acb25c071466a7c0e2e47281bb0b17734b8c7685d8a5fec4e6266e37ca5d2d not found: ID does not exist" Apr 16 18:05:07.200329 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.200316 2578 scope.go:117] "RemoveContainer" containerID="ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b" Apr 16 18:05:07.200612 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.200592 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b"} err="failed to get container status \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": rpc error: code = NotFound desc = could not find container \"ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b\": container with ID starting with ee8454e2fcd58bbaede3a256117e65915e4538e405999dc72e73e5ad1725100b not found: ID does not exist" Apr 16 18:05:07.200612 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.200614 2578 scope.go:117] "RemoveContainer" containerID="6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91" Apr 16 18:05:07.201071 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.200839 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91"} err="failed to get container status 
\"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": rpc error: code = NotFound desc = could not find container \"6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91\": container with ID starting with 6e253a0fe0b03ce9180f04d81558979acb497bf593a1e6a80d8cc4ea580dbc91 not found: ID does not exist" Apr 16 18:05:07.201071 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.200858 2578 scope.go:117] "RemoveContainer" containerID="a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd" Apr 16 18:05:07.201227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.201119 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd"} err="failed to get container status \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": rpc error: code = NotFound desc = could not find container \"a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd\": container with ID starting with a217b4f3a27c5501ce2afc7f76c8b5f80473e1445f1c99209d58f9a45b28a0dd not found: ID does not exist" Apr 16 18:05:07.201227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.201136 2578 scope.go:117] "RemoveContainer" containerID="0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b" Apr 16 18:05:07.201651 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.201618 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b"} err="failed to get container status \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": rpc error: code = NotFound desc = could not find container \"0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b\": container with ID starting with 0754fac7a3ab0149d454efcedb66df36ba1bd909c34a8ec827f459e698f8ad3b not found: ID does not exist" Apr 16 18:05:07.201651 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:05:07.201643 2578 scope.go:117] "RemoveContainer" containerID="5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67" Apr 16 18:05:07.201938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.201912 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67"} err="failed to get container status \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": rpc error: code = NotFound desc = could not find container \"5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67\": container with ID starting with 5041872a8d253e29248ee40dacabb0a78bcdecb2c92c72cbd708e09fbaab3c67 not found: ID does not exist" Apr 16 18:05:07.202012 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.201941 2578 scope.go:117] "RemoveContainer" containerID="93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34" Apr 16 18:05:07.202181 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.202154 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34"} err="failed to get container status \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": rpc error: code = NotFound desc = could not find container \"93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34\": container with ID starting with 93ab11a25abb89bba26b3d122a6d2cf2ff76ee11e8ce76cfd812b138b7bcbf34 not found: ID does not exist" Apr 16 18:05:07.206232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.206106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:07.268148 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268318 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268318 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268258 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-config\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268318 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-web-config\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268327 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca490224-c264-49dc-b0ac-bb60473d99d5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268351 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268445 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268519 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268682 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6wt\" (UniqueName: \"kubernetes.io/projected/ca490224-c264-49dc-b0ac-bb60473d99d5-kube-api-access-ch6wt\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.268734 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca490224-c264-49dc-b0ac-bb60473d99d5-config-out\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.269060 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.269060 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.268772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369792 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-config\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-web-config\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca490224-c264-49dc-b0ac-bb60473d99d5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.369961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369954 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.370309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.369978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.370309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.370309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370054 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.370309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.370309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.370309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.370821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:05:07.370997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6wt\" (UniqueName: \"kubernetes.io/projected/ca490224-c264-49dc-b0ac-bb60473d99d5-kube-api-access-ch6wt\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.371052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca490224-c264-49dc-b0ac-bb60473d99d5-config-out\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.371091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.371133 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.371274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.371878 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:05:07.371522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.373283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.373140 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.373998 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.373684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.374434 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.374405 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.376352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.374964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.376352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.375260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.376352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.375259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.376352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.375622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-web-config\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.376352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.376307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-config\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.377132 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.377082 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.377948 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.377698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.377948 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.377810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca490224-c264-49dc-b0ac-bb60473d99d5-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.378625 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.378599 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ca490224-c264-49dc-b0ac-bb60473d99d5-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.378712 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.378629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ca490224-c264-49dc-b0ac-bb60473d99d5-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.379813 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.379793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca490224-c264-49dc-b0ac-bb60473d99d5-config-out\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.386419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.386400 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6wt\" (UniqueName: \"kubernetes.io/projected/ca490224-c264-49dc-b0ac-bb60473d99d5-kube-api-access-ch6wt\") pod \"prometheus-k8s-0\" (UID: \"ca490224-c264-49dc-b0ac-bb60473d99d5\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.498568 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.498528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:07.657534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.656648 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a56681-2392-44d7-8a0c-d00a834d6160" path="/var/lib/kubelet/pods/73a56681-2392-44d7-8a0c-d00a834d6160/volumes" Apr 16 18:05:07.701214 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:07.695561 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:08.120043 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:08.119954 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca490224-c264-49dc-b0ac-bb60473d99d5" containerID="3c598ab835e3c6df8250b2093eaa471c71fe1dad42ed285143440f5dd1a5aa36" exitCode=0 Apr 16 18:05:08.120447 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:08.120041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerDied","Data":"3c598ab835e3c6df8250b2093eaa471c71fe1dad42ed285143440f5dd1a5aa36"} Apr 16 18:05:08.120447 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:08.120082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"3dbd6d7a574dabc04e714a42be485990eba95de0de0540795bf21956a3efc518"} Apr 16 18:05:09.128875 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.128835 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"ba4c2ee07749c3068b4c91a79bd65b0475934eeea9c92e6cf114933ed6e0aace"} Apr 16 18:05:09.129320 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.128885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"91c5749d25bf49f76f6c9f15fc8b3ad0d626c3f51abd9c3b9e3789758045f925"} Apr 16 18:05:09.129320 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.128903 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"dbf1062f2b9058526a72c6600c6f6c546f9559c0f0ad9983019d9d59304146b9"} Apr 16 18:05:09.129320 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.128917 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"4e63b266cfc02929e3446bdbdf2f10f0349f6ad014b9dfcf13e70bc01ac1921f"} Apr 16 18:05:09.131232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.131173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" event={"ID":"3e82dd6c-3235-423d-a5db-7a937f30e2ed","Type":"ContainerStarted","Data":"bd180e89cd9ff22ddd88640e817693a3926835c5abc88c48474cac3d7da8e699"} Apr 16 18:05:09.131232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.131232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" event={"ID":"3e82dd6c-3235-423d-a5db-7a937f30e2ed","Type":"ContainerStarted","Data":"efd0294c4f800c214deaa06c55c6a6c3d854e806821fa0572f5bb465399fa3bf"} Apr 16 18:05:09.131416 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.131248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" event={"ID":"3e82dd6c-3235-423d-a5db-7a937f30e2ed","Type":"ContainerStarted","Data":"e9a7574fd2a36de205b51b0b7011d772946a104fb22e45a5befa2279da2b9bf0"} Apr 16 18:05:09.161449 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:09.160585 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-684bc67856-gnfsf" podStartSLOduration=1.191942954 podStartE2EDuration="3.160564831s" podCreationTimestamp="2026-04-16 18:05:06 +0000 UTC" firstStartedPulling="2026-04-16 18:05:06.7068386 +0000 UTC m=+101.650451634" lastFinishedPulling="2026-04-16 18:05:08.675460469 +0000 UTC m=+103.619073511" observedRunningTime="2026-04-16 18:05:09.156108536 +0000 UTC m=+104.099721593" watchObservedRunningTime="2026-04-16 18:05:09.160564831 +0000 UTC m=+104.104177889" Apr 16 18:05:10.137911 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:10.137873 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"7c895ce3d2eac811b11747b8bffdfa135ceb790b786efb68a7e4044f294297cc"} Apr 16 18:05:10.137911 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:10.137911 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ca490224-c264-49dc-b0ac-bb60473d99d5","Type":"ContainerStarted","Data":"af9b48faafb8db0feb10e2a066b1838ff6d0fb0bbb969b6ff035c5ccd502f18d"} Apr 16 18:05:10.177085 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:10.177030 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.17701144 podStartE2EDuration="3.17701144s" podCreationTimestamp="2026-04-16 18:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:05:10.175644685 +0000 UTC m=+105.119257740" watchObservedRunningTime="2026-04-16 18:05:10.17701144 +0000 UTC m=+105.120624496" Apr 16 18:05:12.499007 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:12.498975 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:20.830841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:20.830800 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b49d59ddb-ddqd9"] Apr 16 18:05:45.850363 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:45.850304 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b49d59ddb-ddqd9" podUID="738fb93b-b947-4be6-8ec1-29fdc78da521" containerName="console" containerID="cri-o://52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd" gracePeriod=15 Apr 16 18:05:46.100082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.100062 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b49d59ddb-ddqd9_738fb93b-b947-4be6-8ec1-29fdc78da521/console/0.log" Apr 16 18:05:46.100226 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.100125 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:05:46.190824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190734 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-serving-cert\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.190824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190777 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-trusted-ca-bundle\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.190824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190812 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-oauth-serving-cert\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.191080 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190855 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-console-config\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.191080 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190884 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-oauth-config\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.191080 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190915 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-service-ca\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.191080 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.190995 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6nt\" (UniqueName: \"kubernetes.io/projected/738fb93b-b947-4be6-8ec1-29fdc78da521-kube-api-access-ct6nt\") pod \"738fb93b-b947-4be6-8ec1-29fdc78da521\" (UID: \"738fb93b-b947-4be6-8ec1-29fdc78da521\") " Apr 16 18:05:46.191377 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.191348 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:46.191440 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.191370 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:46.191440 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.191405 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-console-config" (OuterVolumeSpecName: "console-config") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:46.191527 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.191472 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-service-ca" (OuterVolumeSpecName: "service-ca") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:46.193047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.193018 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:46.193047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.193029 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:46.193232 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.193134 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/738fb93b-b947-4be6-8ec1-29fdc78da521-kube-api-access-ct6nt" (OuterVolumeSpecName: "kube-api-access-ct6nt") pod "738fb93b-b947-4be6-8ec1-29fdc78da521" (UID: "738fb93b-b947-4be6-8ec1-29fdc78da521"). InnerVolumeSpecName "kube-api-access-ct6nt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:46.248849 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.248823 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b49d59ddb-ddqd9_738fb93b-b947-4be6-8ec1-29fdc78da521/console/0.log" Apr 16 18:05:46.248994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.248859 2578 generic.go:358] "Generic (PLEG): container finished" podID="738fb93b-b947-4be6-8ec1-29fdc78da521" containerID="52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd" exitCode=2 Apr 16 18:05:46.248994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.248890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b49d59ddb-ddqd9" event={"ID":"738fb93b-b947-4be6-8ec1-29fdc78da521","Type":"ContainerDied","Data":"52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd"} Apr 16 18:05:46.248994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.248928 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b49d59ddb-ddqd9" event={"ID":"738fb93b-b947-4be6-8ec1-29fdc78da521","Type":"ContainerDied","Data":"58f19e437430dbbd3413a080f4970d8a1eb36b550dbba1ca6c394753d90cee80"} Apr 16 18:05:46.248994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.248944 2578 scope.go:117] "RemoveContainer" containerID="52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd" Apr 16 18:05:46.248994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.248958 2578 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b49d59ddb-ddqd9" Apr 16 18:05:46.256948 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.256921 2578 scope.go:117] "RemoveContainer" containerID="52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd" Apr 16 18:05:46.257210 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:05:46.257176 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd\": container with ID starting with 52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd not found: ID does not exist" containerID="52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd" Apr 16 18:05:46.257314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.257220 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd"} err="failed to get container status \"52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd\": rpc error: code = NotFound desc = could not find container \"52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd\": container with ID starting with 52920e2e8322b59528c743c5f388ee090e1524abf16fe7b4475d76cf3d9ebdfd not found: ID does not exist" Apr 16 18:05:46.271176 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.271146 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b49d59ddb-ddqd9"] Apr 16 18:05:46.276031 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.276012 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b49d59ddb-ddqd9"] Apr 16 18:05:46.291841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291818 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-oauth-serving-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:46.291841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291842 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-console-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:46.291965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291852 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-oauth-config\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:46.291965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291861 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-service-ca\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:46.291965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291870 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ct6nt\" (UniqueName: \"kubernetes.io/projected/738fb93b-b947-4be6-8ec1-29fdc78da521-kube-api-access-ct6nt\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:46.291965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291878 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/738fb93b-b947-4be6-8ec1-29fdc78da521-console-serving-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:46.291965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:46.291887 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/738fb93b-b947-4be6-8ec1-29fdc78da521-trusted-ca-bundle\") on node 
\"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:05:47.647624 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:05:47.647583 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="738fb93b-b947-4be6-8ec1-29fdc78da521" path="/var/lib/kubelet/pods/738fb93b-b947-4be6-8ec1-29fdc78da521/volumes" Apr 16 18:06:07.499408 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:06:07.499332 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:07.514348 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:06:07.514322 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:08.328806 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:06:08.328781 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:08:22.596777 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.596741 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rvmjz"] Apr 16 18:08:22.597301 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.597067 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="738fb93b-b947-4be6-8ec1-29fdc78da521" containerName="console" Apr 16 18:08:22.597301 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.597079 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="738fb93b-b947-4be6-8ec1-29fdc78da521" containerName="console" Apr 16 18:08:22.597301 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.597148 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="738fb93b-b947-4be6-8ec1-29fdc78da521" containerName="console" Apr 16 18:08:22.600002 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.599988 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.601777 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.601754 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:08:22.608640 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.608619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rvmjz"] Apr 16 18:08:22.697428 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.697390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-original-pull-secret\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.697614 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.697474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-kubelet-config\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.697614 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.697503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-dbus\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.798091 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.798056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-kubelet-config\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.798091 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.798093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-dbus\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.798330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.798142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-original-pull-secret\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.798330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.798186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-kubelet-config\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.798403 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.798362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-dbus\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.800235 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.800210 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aaac5506-c30b-46a4-b8d2-8cffc2dc83d7-original-pull-secret\") pod \"global-pull-secret-syncer-rvmjz\" (UID: \"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7\") " pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:22.910301 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:22.910177 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rvmjz" Apr 16 18:08:23.024217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:23.024165 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rvmjz"] Apr 16 18:08:23.027247 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:08:23.027219 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaac5506_c30b_46a4_b8d2_8cffc2dc83d7.slice/crio-0a47712152521724e1b5dd3a224526a5523f5eefd1be3821ce3fa28819fe5a88 WatchSource:0}: Error finding container 0a47712152521724e1b5dd3a224526a5523f5eefd1be3821ce3fa28819fe5a88: Status 404 returned error can't find the container with id 0a47712152521724e1b5dd3a224526a5523f5eefd1be3821ce3fa28819fe5a88 Apr 16 18:08:23.693424 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:23.693385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rvmjz" event={"ID":"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7","Type":"ContainerStarted","Data":"0a47712152521724e1b5dd3a224526a5523f5eefd1be3821ce3fa28819fe5a88"} Apr 16 18:08:26.842371 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:26.842344 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:08:27.707137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:27.707099 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rvmjz" 
event={"ID":"aaac5506-c30b-46a4-b8d2-8cffc2dc83d7","Type":"ContainerStarted","Data":"e2f5d8086587d425813f54802992dbf0ed9c9e402e831bd1439013e8229fa468"} Apr 16 18:08:27.728119 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:08:27.728075 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rvmjz" podStartSLOduration=1.88203549 podStartE2EDuration="5.728060846s" podCreationTimestamp="2026-04-16 18:08:22 +0000 UTC" firstStartedPulling="2026-04-16 18:08:23.028883545 +0000 UTC m=+297.972496594" lastFinishedPulling="2026-04-16 18:08:26.874908912 +0000 UTC m=+301.818521950" observedRunningTime="2026-04-16 18:08:27.72677024 +0000 UTC m=+302.670383309" watchObservedRunningTime="2026-04-16 18:08:27.728060846 +0000 UTC m=+302.671673901" Apr 16 18:10:20.570045 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.570011 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"] Apr 16 18:10:20.573076 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.573057 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.575208 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.575171 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:10:20.575327 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.575253 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-77k2v\""
Apr 16 18:10:20.575774 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.575753 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:10:20.575872 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.575791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 18:10:20.582140 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.582118 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"]
Apr 16 18:10:20.698140 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.698106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz4mj\" (UniqueName: \"kubernetes.io/projected/91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63-kube-api-access-xz4mj\") pod \"llmisvc-controller-manager-68cc5db7c4-dpxb4\" (UID: \"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.698336 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.698230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dpxb4\" (UID: \"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.798945 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.798908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz4mj\" (UniqueName: \"kubernetes.io/projected/91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63-kube-api-access-xz4mj\") pod \"llmisvc-controller-manager-68cc5db7c4-dpxb4\" (UID: \"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.799135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.798979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dpxb4\" (UID: \"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.801321 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.801291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dpxb4\" (UID: \"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.807159 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.807137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz4mj\" (UniqueName: \"kubernetes.io/projected/91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63-kube-api-access-xz4mj\") pod \"llmisvc-controller-manager-68cc5db7c4-dpxb4\" (UID: \"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:20.884680 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:20.884594 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:21.001442 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:21.001410 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"]
Apr 16 18:10:21.004175 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:10:21.004136 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod91d213d4_bf2d_4e43_8f99_8ffa7ad0ed63.slice/crio-e863bfebc2181b5ec129f5415df0f1c62cfa4a77c83824e322c55e8538f5b7da WatchSource:0}: Error finding container e863bfebc2181b5ec129f5415df0f1c62cfa4a77c83824e322c55e8538f5b7da: Status 404 returned error can't find the container with id e863bfebc2181b5ec129f5415df0f1c62cfa4a77c83824e322c55e8538f5b7da
Apr 16 18:10:21.005468 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:21.005451 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:10:21.022109 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:21.022082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4" event={"ID":"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63","Type":"ContainerStarted","Data":"e863bfebc2181b5ec129f5415df0f1c62cfa4a77c83824e322c55e8538f5b7da"}
Apr 16 18:10:23.029252 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:23.029223 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4" event={"ID":"91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63","Type":"ContainerStarted","Data":"2b41f4ee58cc0478de37471e5d329d59105f870a71029ed37837e3d95dc3fed5"}
Apr 16 18:10:23.029613 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:23.029313 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:10:23.046327 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:23.046280 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4" podStartSLOduration=1.159680952 podStartE2EDuration="3.046267087s" podCreationTimestamp="2026-04-16 18:10:20 +0000 UTC" firstStartedPulling="2026-04-16 18:10:21.005642052 +0000 UTC m=+415.949255096" lastFinishedPulling="2026-04-16 18:10:22.892228196 +0000 UTC m=+417.835841231" observedRunningTime="2026-04-16 18:10:23.044809613 +0000 UTC m=+417.988422669" watchObservedRunningTime="2026-04-16 18:10:23.046267087 +0000 UTC m=+417.989880142"
Apr 16 18:10:54.034420 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:10:54.034391 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dpxb4"
Apr 16 18:11:44.935114 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:44.935070 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sr59x"]
Apr 16 18:11:44.938497 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:44.938474 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sr59x"
Apr 16 18:11:44.941493 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:44.941472 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h7q2z\""
Apr 16 18:11:44.941607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:44.941493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:11:44.949659 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:44.949637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sr59x"]
Apr 16 18:11:45.102386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:45.102355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r644\" (UniqueName: \"kubernetes.io/projected/d5d1d706-2c44-42fd-8877-a5959df4e519-kube-api-access-5r644\") pod \"s3-init-sr59x\" (UID: \"d5d1d706-2c44-42fd-8877-a5959df4e519\") " pod="kserve/s3-init-sr59x"
Apr 16 18:11:45.202776 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:45.202750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r644\" (UniqueName: \"kubernetes.io/projected/d5d1d706-2c44-42fd-8877-a5959df4e519-kube-api-access-5r644\") pod \"s3-init-sr59x\" (UID: \"d5d1d706-2c44-42fd-8877-a5959df4e519\") " pod="kserve/s3-init-sr59x"
Apr 16 18:11:45.212172 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:45.212142 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r644\" (UniqueName: \"kubernetes.io/projected/d5d1d706-2c44-42fd-8877-a5959df4e519-kube-api-access-5r644\") pod \"s3-init-sr59x\" (UID: \"d5d1d706-2c44-42fd-8877-a5959df4e519\") " pod="kserve/s3-init-sr59x"
Apr 16 18:11:45.256352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:45.256327 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sr59x"
Apr 16 18:11:45.379762 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:45.379729 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sr59x"]
Apr 16 18:11:45.383068 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:11:45.383044 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5d1d706_2c44_42fd_8877_a5959df4e519.slice/crio-46d14f809cebc4ba1f0f79f9d31f3889721a647827410ba491b646a6c8b8d1b7 WatchSource:0}: Error finding container 46d14f809cebc4ba1f0f79f9d31f3889721a647827410ba491b646a6c8b8d1b7: Status 404 returned error can't find the container with id 46d14f809cebc4ba1f0f79f9d31f3889721a647827410ba491b646a6c8b8d1b7
Apr 16 18:11:46.270723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:46.270670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sr59x" event={"ID":"d5d1d706-2c44-42fd-8877-a5959df4e519","Type":"ContainerStarted","Data":"46d14f809cebc4ba1f0f79f9d31f3889721a647827410ba491b646a6c8b8d1b7"}
Apr 16 18:11:50.286296 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:50.286263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sr59x" event={"ID":"d5d1d706-2c44-42fd-8877-a5959df4e519","Type":"ContainerStarted","Data":"437e47ba6632c35733c495a8e5eb6bf83f7ce023fcfaf796623b7e1045e0d961"}
Apr 16 18:11:50.304652 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:50.304591 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sr59x" podStartSLOduration=1.843763897 podStartE2EDuration="6.304573137s" podCreationTimestamp="2026-04-16 18:11:44 +0000 UTC" firstStartedPulling="2026-04-16 18:11:45.384843449 +0000 UTC m=+500.328456487" lastFinishedPulling="2026-04-16 18:11:49.845652681 +0000 UTC m=+504.789265727" observedRunningTime="2026-04-16 18:11:50.303363241 +0000 UTC m=+505.246976294" watchObservedRunningTime="2026-04-16 18:11:50.304573137 +0000 UTC m=+505.248186194"
Apr 16 18:11:53.295775 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:53.295737 2578 generic.go:358] "Generic (PLEG): container finished" podID="d5d1d706-2c44-42fd-8877-a5959df4e519" containerID="437e47ba6632c35733c495a8e5eb6bf83f7ce023fcfaf796623b7e1045e0d961" exitCode=0
Apr 16 18:11:53.296141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:53.295801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sr59x" event={"ID":"d5d1d706-2c44-42fd-8877-a5959df4e519","Type":"ContainerDied","Data":"437e47ba6632c35733c495a8e5eb6bf83f7ce023fcfaf796623b7e1045e0d961"}
Apr 16 18:11:54.421428 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:54.421408 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sr59x"
Apr 16 18:11:54.485740 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:54.485715 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r644\" (UniqueName: \"kubernetes.io/projected/d5d1d706-2c44-42fd-8877-a5959df4e519-kube-api-access-5r644\") pod \"d5d1d706-2c44-42fd-8877-a5959df4e519\" (UID: \"d5d1d706-2c44-42fd-8877-a5959df4e519\") "
Apr 16 18:11:54.487794 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:54.487765 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d1d706-2c44-42fd-8877-a5959df4e519-kube-api-access-5r644" (OuterVolumeSpecName: "kube-api-access-5r644") pod "d5d1d706-2c44-42fd-8877-a5959df4e519" (UID: "d5d1d706-2c44-42fd-8877-a5959df4e519"). InnerVolumeSpecName "kube-api-access-5r644". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:54.586456 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:54.586383 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5r644\" (UniqueName: \"kubernetes.io/projected/d5d1d706-2c44-42fd-8877-a5959df4e519-kube-api-access-5r644\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:11:55.302461 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:55.302436 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sr59x"
Apr 16 18:11:55.302646 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:55.302435 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sr59x" event={"ID":"d5d1d706-2c44-42fd-8877-a5959df4e519","Type":"ContainerDied","Data":"46d14f809cebc4ba1f0f79f9d31f3889721a647827410ba491b646a6c8b8d1b7"}
Apr 16 18:11:55.302646 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:11:55.302546 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d14f809cebc4ba1f0f79f9d31f3889721a647827410ba491b646a6c8b8d1b7"
Apr 16 18:12:29.404849 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.404821 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-8dnj6"]
Apr 16 18:12:29.405289 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.405139 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5d1d706-2c44-42fd-8877-a5959df4e519" containerName="s3-init"
Apr 16 18:12:29.405289 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.405150 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d1d706-2c44-42fd-8877-a5959df4e519" containerName="s3-init"
Apr 16 18:12:29.405289 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.405211 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5d1d706-2c44-42fd-8877-a5959df4e519" containerName="s3-init"
Apr 16 18:12:29.436369 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.436338 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-8dnj6"]
Apr 16 18:12:29.436517 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.436438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:29.441209 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.441174 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h7q2z\""
Apr 16 18:12:29.441289 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.441230 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 16 18:12:29.595765 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.595732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh2cr\" (UniqueName: \"kubernetes.io/projected/268b1e70-1f96-45bd-9864-b90de3b6d78d-kube-api-access-wh2cr\") pod \"s3-tls-init-custom-8dnj6\" (UID: \"268b1e70-1f96-45bd-9864-b90de3b6d78d\") " pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:29.696445 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.696359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh2cr\" (UniqueName: \"kubernetes.io/projected/268b1e70-1f96-45bd-9864-b90de3b6d78d-kube-api-access-wh2cr\") pod \"s3-tls-init-custom-8dnj6\" (UID: \"268b1e70-1f96-45bd-9864-b90de3b6d78d\") " pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:29.705450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.705417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh2cr\" (UniqueName: \"kubernetes.io/projected/268b1e70-1f96-45bd-9864-b90de3b6d78d-kube-api-access-wh2cr\") pod \"s3-tls-init-custom-8dnj6\" (UID: \"268b1e70-1f96-45bd-9864-b90de3b6d78d\") " pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:29.765789 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.765755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:29.890510 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:29.890484 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-8dnj6"]
Apr 16 18:12:29.892967 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:12:29.892932 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod268b1e70_1f96_45bd_9864_b90de3b6d78d.slice/crio-0f3e25cf99abcc8849f80625b91db237111d495057b3d61883ed74be2dc8dc05 WatchSource:0}: Error finding container 0f3e25cf99abcc8849f80625b91db237111d495057b3d61883ed74be2dc8dc05: Status 404 returned error can't find the container with id 0f3e25cf99abcc8849f80625b91db237111d495057b3d61883ed74be2dc8dc05
Apr 16 18:12:30.402636 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:30.402552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8dnj6" event={"ID":"268b1e70-1f96-45bd-9864-b90de3b6d78d","Type":"ContainerStarted","Data":"78135d112d65e71c76b7445cbab5f4608c50bdc27d6f437696ee98956377d0ca"}
Apr 16 18:12:30.402636 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:30.402593 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8dnj6" event={"ID":"268b1e70-1f96-45bd-9864-b90de3b6d78d","Type":"ContainerStarted","Data":"0f3e25cf99abcc8849f80625b91db237111d495057b3d61883ed74be2dc8dc05"}
Apr 16 18:12:30.422798 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:30.422750 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-8dnj6" podStartSLOduration=1.422732206 podStartE2EDuration="1.422732206s" podCreationTimestamp="2026-04-16 18:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:30.421451697 +0000 UTC m=+545.365064755" watchObservedRunningTime="2026-04-16 18:12:30.422732206 +0000 UTC m=+545.366345263"
Apr 16 18:12:35.419382 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:35.419349 2578 generic.go:358] "Generic (PLEG): container finished" podID="268b1e70-1f96-45bd-9864-b90de3b6d78d" containerID="78135d112d65e71c76b7445cbab5f4608c50bdc27d6f437696ee98956377d0ca" exitCode=0
Apr 16 18:12:35.419840 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:35.419429 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8dnj6" event={"ID":"268b1e70-1f96-45bd-9864-b90de3b6d78d","Type":"ContainerDied","Data":"78135d112d65e71c76b7445cbab5f4608c50bdc27d6f437696ee98956377d0ca"}
Apr 16 18:12:36.541340 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:36.541318 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:36.552448 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:36.552423 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh2cr\" (UniqueName: \"kubernetes.io/projected/268b1e70-1f96-45bd-9864-b90de3b6d78d-kube-api-access-wh2cr\") pod \"268b1e70-1f96-45bd-9864-b90de3b6d78d\" (UID: \"268b1e70-1f96-45bd-9864-b90de3b6d78d\") "
Apr 16 18:12:36.554339 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:36.554311 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/268b1e70-1f96-45bd-9864-b90de3b6d78d-kube-api-access-wh2cr" (OuterVolumeSpecName: "kube-api-access-wh2cr") pod "268b1e70-1f96-45bd-9864-b90de3b6d78d" (UID: "268b1e70-1f96-45bd-9864-b90de3b6d78d"). InnerVolumeSpecName "kube-api-access-wh2cr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:12:36.653898 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:36.653859 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wh2cr\" (UniqueName: \"kubernetes.io/projected/268b1e70-1f96-45bd-9864-b90de3b6d78d-kube-api-access-wh2cr\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:12:37.425852 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:37.425816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8dnj6" event={"ID":"268b1e70-1f96-45bd-9864-b90de3b6d78d","Type":"ContainerDied","Data":"0f3e25cf99abcc8849f80625b91db237111d495057b3d61883ed74be2dc8dc05"}
Apr 16 18:12:37.425852 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:37.425831 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-8dnj6"
Apr 16 18:12:37.425852 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:37.425849 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f3e25cf99abcc8849f80625b91db237111d495057b3d61883ed74be2dc8dc05"
Apr 16 18:12:39.713052 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.713018 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-d8ts8"]
Apr 16 18:12:39.713477 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.713342 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="268b1e70-1f96-45bd-9864-b90de3b6d78d" containerName="s3-tls-init-custom"
Apr 16 18:12:39.713477 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.713354 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="268b1e70-1f96-45bd-9864-b90de3b6d78d" containerName="s3-tls-init-custom"
Apr 16 18:12:39.713477 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.713419 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="268b1e70-1f96-45bd-9864-b90de3b6d78d" containerName="s3-tls-init-custom"
Apr 16 18:12:39.716277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.716262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:39.719082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.719065 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h7q2z\""
Apr 16 18:12:39.719182 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.719102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 16 18:12:39.726393 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.726374 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-d8ts8"]
Apr 16 18:12:39.780988 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.780958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5b66\" (UniqueName: \"kubernetes.io/projected/46b85c45-3c3c-4bc2-887b-1773012871f3-kube-api-access-w5b66\") pod \"s3-tls-init-serving-d8ts8\" (UID: \"46b85c45-3c3c-4bc2-887b-1773012871f3\") " pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:39.882283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.882246 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5b66\" (UniqueName: \"kubernetes.io/projected/46b85c45-3c3c-4bc2-887b-1773012871f3-kube-api-access-w5b66\") pod \"s3-tls-init-serving-d8ts8\" (UID: \"46b85c45-3c3c-4bc2-887b-1773012871f3\") " pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:39.891423 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:39.891393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5b66\" (UniqueName: \"kubernetes.io/projected/46b85c45-3c3c-4bc2-887b-1773012871f3-kube-api-access-w5b66\") pod \"s3-tls-init-serving-d8ts8\" (UID: \"46b85c45-3c3c-4bc2-887b-1773012871f3\") " pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:40.035851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:40.035767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:40.170989 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:40.170957 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-d8ts8"]
Apr 16 18:12:40.174422 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:12:40.174394 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46b85c45_3c3c_4bc2_887b_1773012871f3.slice/crio-2da178390e493f80afd918cbe98407fe8f1d63adf2b5eb6bf7916d1317219daa WatchSource:0}: Error finding container 2da178390e493f80afd918cbe98407fe8f1d63adf2b5eb6bf7916d1317219daa: Status 404 returned error can't find the container with id 2da178390e493f80afd918cbe98407fe8f1d63adf2b5eb6bf7916d1317219daa
Apr 16 18:12:40.440910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:40.440827 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-d8ts8" event={"ID":"46b85c45-3c3c-4bc2-887b-1773012871f3","Type":"ContainerStarted","Data":"22c6fbbf098508369af688c6e1c4a31ff1bc248a670472ddd5420456811d4efe"}
Apr 16 18:12:40.440910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:40.440869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-d8ts8" event={"ID":"46b85c45-3c3c-4bc2-887b-1773012871f3","Type":"ContainerStarted","Data":"2da178390e493f80afd918cbe98407fe8f1d63adf2b5eb6bf7916d1317219daa"}
Apr 16 18:12:40.467132 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:40.467083 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-d8ts8" podStartSLOduration=1.467068485 podStartE2EDuration="1.467068485s" podCreationTimestamp="2026-04-16 18:12:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:12:40.464489425 +0000 UTC m=+555.408102481" watchObservedRunningTime="2026-04-16 18:12:40.467068485 +0000 UTC m=+555.410681541"
Apr 16 18:12:44.454057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:44.454023 2578 generic.go:358] "Generic (PLEG): container finished" podID="46b85c45-3c3c-4bc2-887b-1773012871f3" containerID="22c6fbbf098508369af688c6e1c4a31ff1bc248a670472ddd5420456811d4efe" exitCode=0
Apr 16 18:12:44.454429 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:44.454096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-d8ts8" event={"ID":"46b85c45-3c3c-4bc2-887b-1773012871f3","Type":"ContainerDied","Data":"22c6fbbf098508369af688c6e1c4a31ff1bc248a670472ddd5420456811d4efe"}
Apr 16 18:12:45.571462 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:45.571441 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:45.631803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:45.631774 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5b66\" (UniqueName: \"kubernetes.io/projected/46b85c45-3c3c-4bc2-887b-1773012871f3-kube-api-access-w5b66\") pod \"46b85c45-3c3c-4bc2-887b-1773012871f3\" (UID: \"46b85c45-3c3c-4bc2-887b-1773012871f3\") "
Apr 16 18:12:45.633796 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:45.633772 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b85c45-3c3c-4bc2-887b-1773012871f3-kube-api-access-w5b66" (OuterVolumeSpecName: "kube-api-access-w5b66") pod "46b85c45-3c3c-4bc2-887b-1773012871f3" (UID: "46b85c45-3c3c-4bc2-887b-1773012871f3"). InnerVolumeSpecName "kube-api-access-w5b66". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:12:45.732909 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:45.732828 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5b66\" (UniqueName: \"kubernetes.io/projected/46b85c45-3c3c-4bc2-887b-1773012871f3-kube-api-access-w5b66\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:12:46.460577 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:46.460533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-d8ts8" event={"ID":"46b85c45-3c3c-4bc2-887b-1773012871f3","Type":"ContainerDied","Data":"2da178390e493f80afd918cbe98407fe8f1d63adf2b5eb6bf7916d1317219daa"}
Apr 16 18:12:46.460577 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:46.460570 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-d8ts8"
Apr 16 18:12:46.460785 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:46.460572 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da178390e493f80afd918cbe98407fe8f1d63adf2b5eb6bf7916d1317219daa"
Apr 16 18:12:56.594134 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.594099 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"]
Apr 16 18:12:56.594733 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.594624 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46b85c45-3c3c-4bc2-887b-1773012871f3" containerName="s3-tls-init-serving"
Apr 16 18:12:56.594733 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.594646 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b85c45-3c3c-4bc2-887b-1773012871f3" containerName="s3-tls-init-serving"
Apr 16 18:12:56.594851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.594744 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="46b85c45-3c3c-4bc2-887b-1773012871f3" containerName="s3-tls-init-serving"
Apr 16 18:12:56.597856 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.597836 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:12:56.600218 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.600200 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-89kqm\""
Apr 16 18:12:56.609500 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.609476 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"]
Apr 16 18:12:56.729021 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.728972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26617e5f-a1ce-468d-a746-438ab7428210-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz\" (UID: \"26617e5f-a1ce-468d-a746-438ab7428210\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:12:56.830310 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.830264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26617e5f-a1ce-468d-a746-438ab7428210-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz\" (UID: \"26617e5f-a1ce-468d-a746-438ab7428210\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:12:56.830645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.830626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26617e5f-a1ce-468d-a746-438ab7428210-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz\" (UID: \"26617e5f-a1ce-468d-a746-438ab7428210\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:12:56.907263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:56.907163 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:12:57.270714 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:57.270545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"]
Apr 16 18:12:57.273300 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:12:57.273272 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26617e5f_a1ce_468d_a746_438ab7428210.slice/crio-c3ffdf2e0ca71d4f0ef2b7d0d35e7caef0e579a033bcd22cdcce922220201da3 WatchSource:0}: Error finding container c3ffdf2e0ca71d4f0ef2b7d0d35e7caef0e579a033bcd22cdcce922220201da3: Status 404 returned error can't find the container with id c3ffdf2e0ca71d4f0ef2b7d0d35e7caef0e579a033bcd22cdcce922220201da3
Apr 16 18:12:57.496541 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:12:57.496508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerStarted","Data":"c3ffdf2e0ca71d4f0ef2b7d0d35e7caef0e579a033bcd22cdcce922220201da3"}
Apr 16 18:13:02.515444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:02.515367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerStarted","Data":"3aa47fd245a9ae63ad4de4bfdb7792ab1560a0cdefcbb0dcfdad9bf0923b7931"}
Apr 16 18:13:06.528941 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:06.528911 2578 generic.go:358] "Generic (PLEG): container finished" podID="26617e5f-a1ce-468d-a746-438ab7428210" containerID="3aa47fd245a9ae63ad4de4bfdb7792ab1560a0cdefcbb0dcfdad9bf0923b7931" exitCode=0
Apr 16 18:13:06.529327 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:06.528988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerDied","Data":"3aa47fd245a9ae63ad4de4bfdb7792ab1560a0cdefcbb0dcfdad9bf0923b7931"}
Apr 16 18:13:19.575803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:19.575728 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerStarted","Data":"9b83720e7a7e748543291c1cfee50b01e80127ecb496d75fb3a459177096d437"}
Apr 16 18:13:22.586583 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:22.586504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerStarted","Data":"1fa7a5a361e6756a6bb2beb1234458b5b2504d41f40becf3cb03d373695ce33d"}
Apr 16 18:13:22.586967 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:22.586733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:13:22.588003 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:22.587958 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:13:22.628170 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:22.628127 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podStartSLOduration=1.671396381 podStartE2EDuration="26.628115201s" podCreationTimestamp="2026-04-16 18:12:56 +0000 UTC" firstStartedPulling="2026-04-16 18:12:57.275180912 +0000 UTC m=+572.218793945" lastFinishedPulling="2026-04-16 18:13:22.231899717 +0000 UTC m=+597.175512765" observedRunningTime="2026-04-16 18:13:22.627544087 +0000 UTC m=+597.571157153" watchObservedRunningTime="2026-04-16 18:13:22.628115201 +0000 UTC m=+597.571728256"
Apr 16 18:13:23.590392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:23.590356 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"
Apr 16 18:13:23.590793 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:23.590491 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:13:23.591259 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:23.591232 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:13:24.593140 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:24.593106 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:13:24.593584 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:24.593562 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:13:34.593545 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:34.593491 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:13:34.608458 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:34.593947 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:13:44.593739 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:44.593641 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused"
Apr 16 18:13:44.594186 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:44.594160 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:13:54.593476 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:54.593431 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure"
output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 18:13:54.593936 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:13:54.593915 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:04.593286 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:04.593237 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 18:14:04.593685 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:04.593634 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:14.593806 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:14.593753 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 18:14:14.594315 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:14.594159 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:24.594374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:24.594343 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" Apr 16 18:14:24.594740 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:24.594397 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" Apr 16 18:14:31.763865 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.763834 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"] Apr 16 18:14:31.764269 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.764117 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" containerID="cri-o://9b83720e7a7e748543291c1cfee50b01e80127ecb496d75fb3a459177096d437" gracePeriod=30 Apr 16 18:14:31.764269 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.764209 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" containerID="cri-o://1fa7a5a361e6756a6bb2beb1234458b5b2504d41f40becf3cb03d373695ce33d" gracePeriod=30 Apr 16 18:14:31.898899 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.898862 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"] Apr 16 18:14:31.902539 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.902523 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:31.920218 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.920175 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"] Apr 16 18:14:31.964631 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:31.964594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5f5a7f2-4e79-4a3b-aae2-457146755274-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh\" (UID: \"b5f5a7f2-4e79-4a3b-aae2-457146755274\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:32.065856 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:32.065770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5f5a7f2-4e79-4a3b-aae2-457146755274-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh\" (UID: \"b5f5a7f2-4e79-4a3b-aae2-457146755274\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:32.066124 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:32.066106 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5f5a7f2-4e79-4a3b-aae2-457146755274-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh\" (UID: \"b5f5a7f2-4e79-4a3b-aae2-457146755274\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:32.212109 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:32.212076 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:32.342019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:32.341994 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"] Apr 16 18:14:32.344665 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:14:32.344635 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5f5a7f2_4e79_4a3b_aae2_457146755274.slice/crio-3432f7d907cae86bbcb1ee976b7678685a37cf41bc09823a160e4fa5c03b13a0 WatchSource:0}: Error finding container 3432f7d907cae86bbcb1ee976b7678685a37cf41bc09823a160e4fa5c03b13a0: Status 404 returned error can't find the container with id 3432f7d907cae86bbcb1ee976b7678685a37cf41bc09823a160e4fa5c03b13a0 Apr 16 18:14:32.796397 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:32.796359 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerStarted","Data":"54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2"} Apr 16 18:14:32.796397 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:32.796401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerStarted","Data":"3432f7d907cae86bbcb1ee976b7678685a37cf41bc09823a160e4fa5c03b13a0"} Apr 16 18:14:34.593345 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:34.593306 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 
18:14:34.593755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:34.593665 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:36.814096 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:36.814063 2578 generic.go:358] "Generic (PLEG): container finished" podID="26617e5f-a1ce-468d-a746-438ab7428210" containerID="9b83720e7a7e748543291c1cfee50b01e80127ecb496d75fb3a459177096d437" exitCode=0 Apr 16 18:14:36.814546 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:36.814161 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerDied","Data":"9b83720e7a7e748543291c1cfee50b01e80127ecb496d75fb3a459177096d437"} Apr 16 18:14:36.815598 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:36.815576 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerID="54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2" exitCode=0 Apr 16 18:14:36.815712 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:36.815653 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerDied","Data":"54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2"} Apr 16 18:14:37.820107 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:37.820070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerStarted","Data":"f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198"} Apr 16 
18:14:37.820107 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:37.820110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerStarted","Data":"65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b"} Apr 16 18:14:37.820639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:37.820421 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:37.821700 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:37.821675 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:14:37.839162 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:37.839119 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podStartSLOduration=6.839106524 podStartE2EDuration="6.839106524s" podCreationTimestamp="2026-04-16 18:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:37.838126099 +0000 UTC m=+672.781739168" watchObservedRunningTime="2026-04-16 18:14:37.839106524 +0000 UTC m=+672.782719579" Apr 16 18:14:38.823230 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:38.823204 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:14:38.823695 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:38.823323 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:14:38.824185 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:38.824165 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:39.826576 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:39.826536 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:14:39.827002 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:39.826829 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:44.593519 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:44.593469 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 18:14:44.593884 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:44.593788 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:49.826934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:49.826888 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:14:49.827439 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:49.827416 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:54.593100 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:54.593064 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 16 18:14:54.593504 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:54.593185 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" Apr 16 18:14:54.593504 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:54.593300 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:14:54.593504 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:54.593403 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" 
Apr 16 18:14:59.826803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:59.826760 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:14:59.827267 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:14:59.827243 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:01.900028 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:01.899998 2578 generic.go:358] "Generic (PLEG): container finished" podID="26617e5f-a1ce-468d-a746-438ab7428210" containerID="1fa7a5a361e6756a6bb2beb1234458b5b2504d41f40becf3cb03d373695ce33d" exitCode=0 Apr 16 18:15:01.900466 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:01.900067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerDied","Data":"1fa7a5a361e6756a6bb2beb1234458b5b2504d41f40becf3cb03d373695ce33d"} Apr 16 18:15:01.912414 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:01.912394 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" Apr 16 18:15:01.916809 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:01.916783 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26617e5f-a1ce-468d-a746-438ab7428210-kserve-provision-location\") pod \"26617e5f-a1ce-468d-a746-438ab7428210\" (UID: \"26617e5f-a1ce-468d-a746-438ab7428210\") " Apr 16 18:15:01.917092 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:01.917067 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26617e5f-a1ce-468d-a746-438ab7428210-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "26617e5f-a1ce-468d-a746-438ab7428210" (UID: "26617e5f-a1ce-468d-a746-438ab7428210"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:15:02.017914 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.017826 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26617e5f-a1ce-468d-a746-438ab7428210-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:15:02.904463 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.904432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" event={"ID":"26617e5f-a1ce-468d-a746-438ab7428210","Type":"ContainerDied","Data":"c3ffdf2e0ca71d4f0ef2b7d0d35e7caef0e579a033bcd22cdcce922220201da3"} Apr 16 18:15:02.904887 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.904477 2578 scope.go:117] "RemoveContainer" containerID="1fa7a5a361e6756a6bb2beb1234458b5b2504d41f40becf3cb03d373695ce33d" Apr 16 18:15:02.904887 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.904441 2578 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz" Apr 16 18:15:02.912552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.912445 2578 scope.go:117] "RemoveContainer" containerID="9b83720e7a7e748543291c1cfee50b01e80127ecb496d75fb3a459177096d437" Apr 16 18:15:02.919735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.919714 2578 scope.go:117] "RemoveContainer" containerID="3aa47fd245a9ae63ad4de4bfdb7792ab1560a0cdefcbb0dcfdad9bf0923b7931" Apr 16 18:15:02.925771 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.925747 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"] Apr 16 18:15:02.927687 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:02.927668 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-757c9f95b4-sm5xz"] Apr 16 18:15:03.648848 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:03.648819 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26617e5f-a1ce-468d-a746-438ab7428210" path="/var/lib/kubelet/pods/26617e5f-a1ce-468d-a746-438ab7428210/volumes" Apr 16 18:15:09.826700 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:09.826603 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:15:09.827154 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:09.826994 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:19.827113 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:15:19.827071 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:15:19.827687 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:19.827450 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:29.827542 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:29.827493 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:15:29.827955 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:29.827936 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:39.826968 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:39.826918 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused" Apr 16 18:15:39.827454 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:39.827335 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:15:49.828002 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:49.827972 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:15:49.828416 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:49.828210 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" Apr 16 18:15:56.901919 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:56.901876 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"] Apr 16 18:15:56.902518 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:56.902208 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" containerID="cri-o://65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b" gracePeriod=30 Apr 16 18:15:56.902518 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:56.902278 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" containerID="cri-o://f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198" gracePeriod=30 Apr 16 18:15:59.827211 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:59.827148 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused"
Apr 16 18:15:59.827646 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:15:59.827502 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:02.090906 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:02.090874 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerID="65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b" exitCode=0
Apr 16 18:16:02.091277 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:02.090937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerDied","Data":"65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b"}
Apr 16 18:16:07.021554 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021521 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"]
Apr 16 18:16:07.021960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021881 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container"
Apr 16 18:16:07.021960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021901 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container"
Apr 16 18:16:07.021960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021916 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="storage-initializer"
Apr 16 18:16:07.021960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021921 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="storage-initializer"
Apr 16 18:16:07.021960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021937 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent"
Apr 16 18:16:07.021960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021943 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent"
Apr 16 18:16:07.022146 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.021991 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="kserve-container"
Apr 16 18:16:07.022146 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.022003 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="26617e5f-a1ce-468d-a746-438ab7428210" containerName="agent"
Apr 16 18:16:07.025303 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.025286 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:07.037623 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.037598 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"]
Apr 16 18:16:07.076325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.076292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6379f99b-7391-4aab-b7c0-bef6c72c1769-kserve-provision-location\") pod \"isvc-logger-predictor-69fc5c8d55-lt7tf\" (UID: \"6379f99b-7391-4aab-b7c0-bef6c72c1769\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:07.177174 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.177138 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6379f99b-7391-4aab-b7c0-bef6c72c1769-kserve-provision-location\") pod \"isvc-logger-predictor-69fc5c8d55-lt7tf\" (UID: \"6379f99b-7391-4aab-b7c0-bef6c72c1769\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:07.177542 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.177524 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6379f99b-7391-4aab-b7c0-bef6c72c1769-kserve-provision-location\") pod \"isvc-logger-predictor-69fc5c8d55-lt7tf\" (UID: \"6379f99b-7391-4aab-b7c0-bef6c72c1769\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:07.335292 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.335185 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:07.456115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.456089 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"]
Apr 16 18:16:07.458738 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:16:07.458696 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6379f99b_7391_4aab_b7c0_bef6c72c1769.slice/crio-2751a97153d1399db758f7892d874c3c738bf2938939164bd1a9f7e1fe3f1371 WatchSource:0}: Error finding container 2751a97153d1399db758f7892d874c3c738bf2938939164bd1a9f7e1fe3f1371: Status 404 returned error can't find the container with id 2751a97153d1399db758f7892d874c3c738bf2938939164bd1a9f7e1fe3f1371
Apr 16 18:16:07.460566 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:07.460553 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:16:08.109223 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:08.109173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerStarted","Data":"4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1"}
Apr 16 18:16:08.109223 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:08.109228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerStarted","Data":"2751a97153d1399db758f7892d874c3c738bf2938939164bd1a9f7e1fe3f1371"}
Apr 16 18:16:09.826847 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:09.826806 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused"
Apr 16 18:16:09.827283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:09.827114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:12.121513 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:12.121477 2578 generic.go:358] "Generic (PLEG): container finished" podID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerID="4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1" exitCode=0
Apr 16 18:16:12.121873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:12.121552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerDied","Data":"4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1"}
Apr 16 18:16:13.126892 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.126861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerStarted","Data":"6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f"}
Apr 16 18:16:13.126892 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.126901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerStarted","Data":"499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586"}
Apr 16 18:16:13.127374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.127184 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:13.127374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.127223 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:16:13.128742 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.128714 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:16:13.129434 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.129367 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:13.146102 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:13.146055 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podStartSLOduration=6.146035574 podStartE2EDuration="6.146035574s" podCreationTimestamp="2026-04-16 18:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:16:13.143651573 +0000 UTC m=+768.087264628" watchObservedRunningTime="2026-04-16 18:16:13.146035574 +0000 UTC m=+768.089648633"
Apr 16 18:16:14.129802 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:14.129754 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:16:14.130229 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:14.130067 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:19.827094 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:19.827048 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:5000: connect: connection refused"
Apr 16 18:16:19.827558 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:19.827178 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"
Apr 16 18:16:19.827558 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:19.827398 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:19.827558 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:19.827508 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"
Apr 16 18:16:24.130631 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:24.130591 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:16:24.131104 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:24.131077 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:27.052691 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.052668 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"
Apr 16 18:16:27.162420 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.162343 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5f5a7f2-4e79-4a3b-aae2-457146755274-kserve-provision-location\") pod \"b5f5a7f2-4e79-4a3b-aae2-457146755274\" (UID: \"b5f5a7f2-4e79-4a3b-aae2-457146755274\") "
Apr 16 18:16:27.162686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.162665 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5f5a7f2-4e79-4a3b-aae2-457146755274-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5f5a7f2-4e79-4a3b-aae2-457146755274" (UID: "b5f5a7f2-4e79-4a3b-aae2-457146755274"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:16:27.172425 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.172397 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerID="f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198" exitCode=0
Apr 16 18:16:27.172552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.172437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerDied","Data":"f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198"}
Apr 16 18:16:27.172552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.172461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh" event={"ID":"b5f5a7f2-4e79-4a3b-aae2-457146755274","Type":"ContainerDied","Data":"3432f7d907cae86bbcb1ee976b7678685a37cf41bc09823a160e4fa5c03b13a0"}
Apr 16 18:16:27.172552 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.172475 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"
Apr 16 18:16:27.172653 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.172480 2578 scope.go:117] "RemoveContainer" containerID="f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198"
Apr 16 18:16:27.180711 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.180694 2578 scope.go:117] "RemoveContainer" containerID="65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b"
Apr 16 18:16:27.187344 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.187326 2578 scope.go:117] "RemoveContainer" containerID="54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2"
Apr 16 18:16:27.193035 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.193010 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"]
Apr 16 18:16:27.194915 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.194895 2578 scope.go:117] "RemoveContainer" containerID="f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198"
Apr 16 18:16:27.195229 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:16:27.195158 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198\": container with ID starting with f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198 not found: ID does not exist" containerID="f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198"
Apr 16 18:16:27.195311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.195280 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198"} err="failed to get container status \"f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198\": rpc error: code = NotFound desc = could not find container \"f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198\": container with ID starting with f000cad82eb3b142ae20a7aa2bb84bd1c660038020a35403265f2db7c20cc198 not found: ID does not exist"
Apr 16 18:16:27.195311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.195306 2578 scope.go:117] "RemoveContainer" containerID="65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b"
Apr 16 18:16:27.195825 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:16:27.195803 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b\": container with ID starting with 65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b not found: ID does not exist" containerID="65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b"
Apr 16 18:16:27.195900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.195837 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b"} err="failed to get container status \"65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b\": rpc error: code = NotFound desc = could not find container \"65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b\": container with ID starting with 65f2d8156ec1c6c2b1e4494e5acbb9877c5f1adf30e0df889a9d70a542d7264b not found: ID does not exist"
Apr 16 18:16:27.195900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.195859 2578 scope.go:117] "RemoveContainer" containerID="54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2"
Apr 16 18:16:27.196120 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:16:27.196099 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2\": container with ID starting with 54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2 not found: ID does not exist" containerID="54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2"
Apr 16 18:16:27.196163 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.196129 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2"} err="failed to get container status \"54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2\": rpc error: code = NotFound desc = could not find container \"54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2\": container with ID starting with 54064b8f42a1f12b6bf9a093862ec0d5eedb308f5c06ce464944369f5dd2e2f2 not found: ID does not exist"
Apr 16 18:16:27.197079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.197062 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6b6f596fd9-gchnh"]
Apr 16 18:16:27.263618 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.263579 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5f5a7f2-4e79-4a3b-aae2-457146755274-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:16:27.648385 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:27.648355 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" path="/var/lib/kubelet/pods/b5f5a7f2-4e79-4a3b-aae2-457146755274/volumes"
Apr 16 18:16:34.130618 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:34.130574 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:16:34.131063 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:34.131038 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:44.129841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:44.129740 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:16:44.130388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:44.130331 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:16:54.130490 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:54.130446 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:16:54.130930 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:16:54.130908 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:04.130042 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:04.129993 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:17:04.130530 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:04.130467 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:14.130413 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:14.130361 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:17:14.130866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:14.130843 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:24.131003 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:24.130974 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:17:24.131520 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:24.131153 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:17:32.233398 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.233368 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"]
Apr 16 18:17:32.233885 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.233637 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" containerID="cri-o://499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586" gracePeriod=30
Apr 16 18:17:32.233885 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.233736 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" containerID="cri-o://6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f" gracePeriod=30
Apr 16 18:17:32.266924 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.266892 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"]
Apr 16 18:17:32.267263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267251 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent"
Apr 16 18:17:32.267311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267264 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent"
Apr 16 18:17:32.267311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267275 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="storage-initializer"
Apr 16 18:17:32.267311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267281 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="storage-initializer"
Apr 16 18:17:32.267311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267289 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container"
Apr 16 18:17:32.267311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267295 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container"
Apr 16 18:17:32.267463 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267348 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="agent"
Apr 16 18:17:32.267463 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.267356 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5f5a7f2-4e79-4a3b-aae2-457146755274" containerName="kserve-container"
Apr 16 18:17:32.271529 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.271515 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"
Apr 16 18:17:32.283510 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.283486 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"]
Apr 16 18:17:32.316819 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.316786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f938c90-4ab1-45b9-b48d-58dc276b1647-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-pmbvs\" (UID: \"2f938c90-4ab1-45b9-b48d-58dc276b1647\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"
Apr 16 18:17:32.417616 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.417585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f938c90-4ab1-45b9-b48d-58dc276b1647-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-pmbvs\" (UID: \"2f938c90-4ab1-45b9-b48d-58dc276b1647\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"
Apr 16 18:17:32.418021 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.417997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f938c90-4ab1-45b9-b48d-58dc276b1647-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-pmbvs\" (UID: \"2f938c90-4ab1-45b9-b48d-58dc276b1647\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"
Apr 16 18:17:32.581769 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.581659 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"
Apr 16 18:17:32.703117 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:32.703090 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"]
Apr 16 18:17:32.704951 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:17:32.704916 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f938c90_4ab1_45b9_b48d_58dc276b1647.slice/crio-c50823313a67954d883e56d9e0ae15cf1dbbfc5bc475f868d31fa1777c9dfd36 WatchSource:0}: Error finding container c50823313a67954d883e56d9e0ae15cf1dbbfc5bc475f868d31fa1777c9dfd36: Status 404 returned error can't find the container with id c50823313a67954d883e56d9e0ae15cf1dbbfc5bc475f868d31fa1777c9dfd36
Apr 16 18:17:33.371806 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:33.371771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" event={"ID":"2f938c90-4ab1-45b9-b48d-58dc276b1647","Type":"ContainerStarted","Data":"edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f"}
Apr 16 18:17:33.371806 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:33.371809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" event={"ID":"2f938c90-4ab1-45b9-b48d-58dc276b1647","Type":"ContainerStarted","Data":"c50823313a67954d883e56d9e0ae15cf1dbbfc5bc475f868d31fa1777c9dfd36"}
Apr 16 18:17:34.130592 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:34.130545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:17:34.130925 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:34.130897 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:37.384489 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:37.384458 2578 generic.go:358] "Generic (PLEG): container finished" podID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerID="499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586" exitCode=0
Apr 16 18:17:37.384921 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:37.384526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerDied","Data":"499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586"}
Apr 16 18:17:37.385819 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:37.385797 2578 generic.go:358] "Generic (PLEG): container finished" podID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerID="edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f" exitCode=0
Apr 16 18:17:37.385919 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:37.385826 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" event={"ID":"2f938c90-4ab1-45b9-b48d-58dc276b1647","Type":"ContainerDied","Data":"edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f"}
Apr 16 18:17:44.129834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:44.129786 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:17:44.130327 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:44.130179 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:44.412715 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:44.412629 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" event={"ID":"2f938c90-4ab1-45b9-b48d-58dc276b1647","Type":"ContainerStarted","Data":"11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75"}
Apr 16 18:17:44.412925 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:44.412896 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"
Apr 16 18:17:44.414242 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:44.414218 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 16 18:17:44.432019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:44.431970 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podStartSLOduration=5.531431439 podStartE2EDuration="12.431956408s" podCreationTimestamp="2026-04-16 18:17:32 +0000 UTC" firstStartedPulling="2026-04-16 18:17:37.387060622 +0000 UTC m=+852.330673657" lastFinishedPulling="2026-04-16 18:17:44.287585591 +0000 UTC m=+859.231198626" observedRunningTime="2026-04-16 18:17:44.430152853 +0000 UTC m=+859.373765909" watchObservedRunningTime="2026-04-16 18:17:44.431956408 +0000 UTC m=+859.375569514"
Apr 16 18:17:45.416360 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:45.416316 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 16 18:17:54.130581 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:54.130535 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:8080: connect: connection refused"
Apr 16 18:17:54.131047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:54.130671 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:17:54.131047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:54.130885 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:17:54.131047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:54.130956 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:17:55.416569 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:17:55.416522 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 16 18:18:02.374283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.374263 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"
Apr 16 18:18:02.466503 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.466473 2578 generic.go:358] "Generic (PLEG): container finished" podID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerID="6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f" exitCode=137
Apr 16 18:18:02.466665 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.466541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerDied","Data":"6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f"}
Apr 16 18:18:02.466665 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.466561 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" Apr 16 18:18:02.466665 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.466580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf" event={"ID":"6379f99b-7391-4aab-b7c0-bef6c72c1769","Type":"ContainerDied","Data":"2751a97153d1399db758f7892d874c3c738bf2938939164bd1a9f7e1fe3f1371"} Apr 16 18:18:02.466665 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.466596 2578 scope.go:117] "RemoveContainer" containerID="6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f" Apr 16 18:18:02.473956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.473942 2578 scope.go:117] "RemoveContainer" containerID="499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586" Apr 16 18:18:02.474801 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.474754 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6379f99b-7391-4aab-b7c0-bef6c72c1769-kserve-provision-location\") pod \"6379f99b-7391-4aab-b7c0-bef6c72c1769\" (UID: \"6379f99b-7391-4aab-b7c0-bef6c72c1769\") " Apr 16 18:18:02.475067 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.475046 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6379f99b-7391-4aab-b7c0-bef6c72c1769-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6379f99b-7391-4aab-b7c0-bef6c72c1769" (UID: "6379f99b-7391-4aab-b7c0-bef6c72c1769"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:18:02.481603 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.481582 2578 scope.go:117] "RemoveContainer" containerID="4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1" Apr 16 18:18:02.488110 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.488096 2578 scope.go:117] "RemoveContainer" containerID="6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f" Apr 16 18:18:02.488406 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:18:02.488384 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f\": container with ID starting with 6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f not found: ID does not exist" containerID="6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f" Apr 16 18:18:02.488464 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.488414 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f"} err="failed to get container status \"6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f\": rpc error: code = NotFound desc = could not find container \"6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f\": container with ID starting with 6ddc6348ad12cccbbfc1fa5cf166546e43bbcd707ce9b49091bf4e2bee32957f not found: ID does not exist" Apr 16 18:18:02.488464 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.488433 2578 scope.go:117] "RemoveContainer" containerID="499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586" Apr 16 18:18:02.488652 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:18:02.488638 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586\": container with ID starting with 499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586 not found: ID does not exist" containerID="499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586" Apr 16 18:18:02.488696 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.488656 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586"} err="failed to get container status \"499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586\": rpc error: code = NotFound desc = could not find container \"499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586\": container with ID starting with 499d31cecee90ee3d99971db4a280132664bb68aca3915a8042d94b5563a0586 not found: ID does not exist" Apr 16 18:18:02.488696 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.488668 2578 scope.go:117] "RemoveContainer" containerID="4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1" Apr 16 18:18:02.488890 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:18:02.488875 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1\": container with ID starting with 4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1 not found: ID does not exist" containerID="4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1" Apr 16 18:18:02.488927 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.488894 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1"} err="failed to get container status \"4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1\": rpc error: code = NotFound desc = could not find container 
\"4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1\": container with ID starting with 4d357db06f1522a30f7610fceebc80af65805adabb7db4c709d692d64c6d0ac1 not found: ID does not exist" Apr 16 18:18:02.576021 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.575990 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6379f99b-7391-4aab-b7c0-bef6c72c1769-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:18:02.793003 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.792968 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"] Apr 16 18:18:02.795540 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:02.795516 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-69fc5c8d55-lt7tf"] Apr 16 18:18:03.647291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:03.647255 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" path="/var/lib/kubelet/pods/6379f99b-7391-4aab-b7c0-bef6c72c1769/volumes" Apr 16 18:18:05.417103 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:05.417061 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:18:15.416656 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:15.416564 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:18:25.416284 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:18:25.416241 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:18:35.416675 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:35.416632 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:18:45.417109 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:45.417056 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:18:54.643964 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:18:54.643912 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:19:04.645609 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:04.645574 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" Apr 16 18:19:12.430680 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.430644 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"] Apr 16 18:19:12.431115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.430940 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" containerID="cri-o://11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75" gracePeriod=30 Apr 16 18:19:12.567480 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567446 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"] Apr 16 18:19:12.567790 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567778 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="storage-initializer" Apr 16 18:19:12.567844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567792 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="storage-initializer" Apr 16 18:19:12.567844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567801 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" Apr 16 18:19:12.567844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567807 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" Apr 16 18:19:12.567844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567820 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" Apr 16 18:19:12.567844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567826 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="kserve-container" Apr 16 18:19:12.568017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567873 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" 
containerName="kserve-container" Apr 16 18:19:12.568017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.567881 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6379f99b-7391-4aab-b7c0-bef6c72c1769" containerName="agent" Apr 16 18:19:12.569985 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.569968 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" Apr 16 18:19:12.579641 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.579610 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"] Apr 16 18:19:12.767186 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.767151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5369015-efb6-4ae2-a14c-9e1ad08e093b-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj\" (UID: \"b5369015-efb6-4ae2-a14c-9e1ad08e093b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" Apr 16 18:19:12.867786 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.867744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5369015-efb6-4ae2-a14c-9e1ad08e093b-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj\" (UID: \"b5369015-efb6-4ae2-a14c-9e1ad08e093b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" Apr 16 18:19:12.868121 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.868103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5369015-efb6-4ae2-a14c-9e1ad08e093b-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj\" 
(UID: \"b5369015-efb6-4ae2-a14c-9e1ad08e093b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" Apr 16 18:19:12.880790 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:12.880770 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" Apr 16 18:19:13.000374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:13.000353 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"] Apr 16 18:19:13.002868 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:19:13.002841 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5369015_efb6_4ae2_a14c_9e1ad08e093b.slice/crio-34a1273b5481a427293907cc270006507a90f481acdccbe0a9ad5850701a5c1a WatchSource:0}: Error finding container 34a1273b5481a427293907cc270006507a90f481acdccbe0a9ad5850701a5c1a: Status 404 returned error can't find the container with id 34a1273b5481a427293907cc270006507a90f481acdccbe0a9ad5850701a5c1a Apr 16 18:19:13.671859 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:13.671822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" event={"ID":"b5369015-efb6-4ae2-a14c-9e1ad08e093b","Type":"ContainerStarted","Data":"71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8"} Apr 16 18:19:13.671859 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:13.671862 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" event={"ID":"b5369015-efb6-4ae2-a14c-9e1ad08e093b","Type":"ContainerStarted","Data":"34a1273b5481a427293907cc270006507a90f481acdccbe0a9ad5850701a5c1a"} Apr 16 18:19:14.644597 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:14.644554 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 16 18:19:16.973414 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:16.973395 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" Apr 16 18:19:16.999428 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:16.999402 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f938c90-4ab1-45b9-b48d-58dc276b1647-kserve-provision-location\") pod \"2f938c90-4ab1-45b9-b48d-58dc276b1647\" (UID: \"2f938c90-4ab1-45b9-b48d-58dc276b1647\") " Apr 16 18:19:16.999684 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:16.999663 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f938c90-4ab1-45b9-b48d-58dc276b1647-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2f938c90-4ab1-45b9-b48d-58dc276b1647" (UID: "2f938c90-4ab1-45b9-b48d-58dc276b1647"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:19:17.100099 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.100065 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f938c90-4ab1-45b9-b48d-58dc276b1647-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:19:17.687853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.687823 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerID="71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8" exitCode=0 Apr 16 18:19:17.688036 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.687899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" event={"ID":"b5369015-efb6-4ae2-a14c-9e1ad08e093b","Type":"ContainerDied","Data":"71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8"} Apr 16 18:19:17.689222 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.689182 2578 generic.go:358] "Generic (PLEG): container finished" podID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerID="11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75" exitCode=0 Apr 16 18:19:17.689316 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.689226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" event={"ID":"2f938c90-4ab1-45b9-b48d-58dc276b1647","Type":"ContainerDied","Data":"11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75"} Apr 16 18:19:17.689316 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.689255 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" event={"ID":"2f938c90-4ab1-45b9-b48d-58dc276b1647","Type":"ContainerDied","Data":"c50823313a67954d883e56d9e0ae15cf1dbbfc5bc475f868d31fa1777c9dfd36"} Apr 
16 18:19:17.689316 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.689265 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs" Apr 16 18:19:17.689476 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.689270 2578 scope.go:117] "RemoveContainer" containerID="11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75" Apr 16 18:19:17.697231 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.697214 2578 scope.go:117] "RemoveContainer" containerID="edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f" Apr 16 18:19:17.704431 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.704412 2578 scope.go:117] "RemoveContainer" containerID="11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75" Apr 16 18:19:17.704711 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:19:17.704691 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75\": container with ID starting with 11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75 not found: ID does not exist" containerID="11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75" Apr 16 18:19:17.704774 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.704720 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75"} err="failed to get container status \"11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75\": rpc error: code = NotFound desc = could not find container \"11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75\": container with ID starting with 11c259ea036c99163b3d191b3d1510600cc5309e0531744a82dc52edf0735d75 not found: ID does not exist" Apr 16 18:19:17.704774 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.704736 2578 
scope.go:117] "RemoveContainer" containerID="edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f" Apr 16 18:19:17.704977 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:19:17.704958 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f\": container with ID starting with edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f not found: ID does not exist" containerID="edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f" Apr 16 18:19:17.705029 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.704984 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f"} err="failed to get container status \"edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f\": rpc error: code = NotFound desc = could not find container \"edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f\": container with ID starting with edbdb55f3f5ee6b6bfd4d478b86b11d7d1d1d60022ccb6d0b8a99ed5b2ef633f not found: ID does not exist" Apr 16 18:19:17.721764 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.721733 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"] Apr 16 18:19:17.728096 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:17.728073 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-pmbvs"] Apr 16 18:19:18.694155 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:18.694116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" event={"ID":"b5369015-efb6-4ae2-a14c-9e1ad08e093b","Type":"ContainerStarted","Data":"98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404"} Apr 16 18:19:18.694669 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:18.694422 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" Apr 16 18:19:18.695596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:18.695565 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 18:19:18.713994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:18.713951 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podStartSLOduration=6.713935278 podStartE2EDuration="6.713935278s" podCreationTimestamp="2026-04-16 18:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:18.711600101 +0000 UTC m=+953.655213157" watchObservedRunningTime="2026-04-16 18:19:18.713935278 +0000 UTC m=+953.657548364" Apr 16 18:19:19.654013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:19.653969 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" path="/var/lib/kubelet/pods/2f938c90-4ab1-45b9-b48d-58dc276b1647/volumes" Apr 16 18:19:19.697922 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:19.697878 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 16 18:19:29.698123 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:29.698078 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:19:39.698840 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:39.698747 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:19:49.698267 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:49.698224 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:19:59.698487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:19:59.698442 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:20:09.698457 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:09.698413 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:20:19.698614 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:19.698568 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:20:29.698794 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:29.698755 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:20:31.644681 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:31.644632 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 16 18:20:41.647654 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:41.647625 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"
Apr 16 18:20:43.161320 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.161290 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"]
Apr 16 18:20:43.161768 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.161604 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container" containerID="cri-o://98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404" gracePeriod=30
Apr 16 18:20:43.289206 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.289162 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"]
Apr 16 18:20:43.289512 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.289500 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container"
Apr 16 18:20:43.289555 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.289515 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container"
Apr 16 18:20:43.289555 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.289527 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="storage-initializer"
Apr 16 18:20:43.289555 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.289533 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="storage-initializer"
Apr 16 18:20:43.289648 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.289588 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f938c90-4ab1-45b9-b48d-58dc276b1647" containerName="kserve-container"
Apr 16 18:20:43.292442 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.292426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:20:43.307208 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.307159 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"]
Apr 16 18:20:43.402564 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.402534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d1c68c6-a9d0-4258-9c90-7934b9b99621-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2\" (UID: \"2d1c68c6-a9d0-4258-9c90-7934b9b99621\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:20:43.503169 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.503130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d1c68c6-a9d0-4258-9c90-7934b9b99621-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2\" (UID: \"2d1c68c6-a9d0-4258-9c90-7934b9b99621\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:20:43.503532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.503512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d1c68c6-a9d0-4258-9c90-7934b9b99621-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2\" (UID: \"2d1c68c6-a9d0-4258-9c90-7934b9b99621\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:20:43.601761 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.601728 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:20:43.723637 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.723605 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"]
Apr 16 18:20:43.726582 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:20:43.726554 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d1c68c6_a9d0_4258_9c90_7934b9b99621.slice/crio-d97f1065998788875214d8f98d21a32cda78ac0253015dcea7483f7dc30710b1 WatchSource:0}: Error finding container d97f1065998788875214d8f98d21a32cda78ac0253015dcea7483f7dc30710b1: Status 404 returned error can't find the container with id d97f1065998788875214d8f98d21a32cda78ac0253015dcea7483f7dc30710b1
Apr 16 18:20:43.942580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.942541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" event={"ID":"2d1c68c6-a9d0-4258-9c90-7934b9b99621","Type":"ContainerStarted","Data":"6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a"}
Apr 16 18:20:43.942745 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:43.942587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" event={"ID":"2d1c68c6-a9d0-4258-9c90-7934b9b99621","Type":"ContainerStarted","Data":"d97f1065998788875214d8f98d21a32cda78ac0253015dcea7483f7dc30710b1"}
Apr 16 18:20:47.890297 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.890272 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"
Apr 16 18:20:47.943842 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.943808 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5369015-efb6-4ae2-a14c-9e1ad08e093b-kserve-provision-location\") pod \"b5369015-efb6-4ae2-a14c-9e1ad08e093b\" (UID: \"b5369015-efb6-4ae2-a14c-9e1ad08e093b\") "
Apr 16 18:20:47.944180 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.944159 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5369015-efb6-4ae2-a14c-9e1ad08e093b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5369015-efb6-4ae2-a14c-9e1ad08e093b" (UID: "b5369015-efb6-4ae2-a14c-9e1ad08e093b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:20:47.956387 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.956361 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerID="98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404" exitCode=0
Apr 16 18:20:47.956483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.956424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" event={"ID":"b5369015-efb6-4ae2-a14c-9e1ad08e093b","Type":"ContainerDied","Data":"98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404"}
Apr 16 18:20:47.956483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.956453 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj" event={"ID":"b5369015-efb6-4ae2-a14c-9e1ad08e093b","Type":"ContainerDied","Data":"34a1273b5481a427293907cc270006507a90f481acdccbe0a9ad5850701a5c1a"}
Apr 16 18:20:47.956483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.956431 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"
Apr 16 18:20:47.956630 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.956510 2578 scope.go:117] "RemoveContainer" containerID="98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404"
Apr 16 18:20:47.957860 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.957835 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerID="6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a" exitCode=0
Apr 16 18:20:47.957970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.957904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" event={"ID":"2d1c68c6-a9d0-4258-9c90-7934b9b99621","Type":"ContainerDied","Data":"6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a"}
Apr 16 18:20:47.964979 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.964963 2578 scope.go:117] "RemoveContainer" containerID="71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8"
Apr 16 18:20:47.971899 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.971881 2578 scope.go:117] "RemoveContainer" containerID="98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404"
Apr 16 18:20:47.972146 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:20:47.972125 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404\": container with ID starting with 98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404 not found: ID does not exist" containerID="98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404"
Apr 16 18:20:47.972239 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.972157 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404"} err="failed to get container status \"98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404\": rpc error: code = NotFound desc = could not find container \"98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404\": container with ID starting with 98a5574bd17d422ae9c892d529bf109a9decbab3af80f627b1de0564154b6404 not found: ID does not exist"
Apr 16 18:20:47.972239 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.972181 2578 scope.go:117] "RemoveContainer" containerID="71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8"
Apr 16 18:20:47.972469 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:20:47.972451 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8\": container with ID starting with 71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8 not found: ID does not exist" containerID="71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8"
Apr 16 18:20:47.972512 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.972475 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8"} err="failed to get container status \"71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8\": rpc error: code = NotFound desc = could not find container \"71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8\": container with ID starting with 71db7f52ffdbd79fd7e7e056c817720e336e9ee7f940c200d8d85552ac6180f8 not found: ID does not exist"
Apr 16 18:20:47.989130 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.989067 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"]
Apr 16 18:20:47.992463 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:47.992444 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-4tzmj"]
Apr 16 18:20:48.045141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:48.045108 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5369015-efb6-4ae2-a14c-9e1ad08e093b-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:20:49.650622 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:20:49.650244 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" path="/var/lib/kubelet/pods/b5369015-efb6-4ae2-a14c-9e1ad08e093b/volumes"
Apr 16 18:23:09.439818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:09.436339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" event={"ID":"2d1c68c6-a9d0-4258-9c90-7934b9b99621","Type":"ContainerStarted","Data":"c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6"}
Apr 16 18:23:09.439818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:09.437362 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:23:09.468393 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:09.468312 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" podStartSLOduration=5.99062056 podStartE2EDuration="2m26.468294835s" podCreationTimestamp="2026-04-16 18:20:43 +0000 UTC" firstStartedPulling="2026-04-16 18:20:47.958977031 +0000 UTC m=+1042.902590072" lastFinishedPulling="2026-04-16 18:23:08.436651307 +0000 UTC m=+1183.380264347" observedRunningTime="2026-04-16 18:23:09.467097713 +0000 UTC m=+1184.410710769" watchObservedRunningTime="2026-04-16 18:23:09.468294835 +0000 UTC m=+1184.411907893"
Apr 16 18:23:41.448413 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:41.448381 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:23:43.456665 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.456624 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"]
Apr 16 18:23:43.457122 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.456937 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerName="kserve-container" containerID="cri-o://c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6" gracePeriod=30
Apr 16 18:23:43.573243 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.573204 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"]
Apr 16 18:23:43.573580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.573566 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container"
Apr 16 18:23:43.573623 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.573582 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container"
Apr 16 18:23:43.573623 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.573591 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="storage-initializer"
Apr 16 18:23:43.573623 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.573597 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="storage-initializer"
Apr 16 18:23:43.573717 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.573644 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5369015-efb6-4ae2-a14c-9e1ad08e093b" containerName="kserve-container"
Apr 16 18:23:43.594270 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.594245 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"]
Apr 16 18:23:43.594419 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.594373 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:23:43.688235 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.688171 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9291b-ab76-4262-b891-efbbad7c2b3a-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9\" (UID: \"23f9291b-ab76-4262-b891-efbbad7c2b3a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:23:43.788981 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.788899 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9291b-ab76-4262-b891-efbbad7c2b3a-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9\" (UID: \"23f9291b-ab76-4262-b891-efbbad7c2b3a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:23:43.789307 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.789284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9291b-ab76-4262-b891-efbbad7c2b3a-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9\" (UID: \"23f9291b-ab76-4262-b891-efbbad7c2b3a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:23:43.906372 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:43.906337 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:23:44.033380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.033331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"]
Apr 16 18:23:44.038251 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:23:44.038217 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f9291b_ab76_4262_b891_efbbad7c2b3a.slice/crio-9b2ced5ed356ebb950886aa136597aa3750399ca26bd0ebb8cbeb3bca21e0e9f WatchSource:0}: Error finding container 9b2ced5ed356ebb950886aa136597aa3750399ca26bd0ebb8cbeb3bca21e0e9f: Status 404 returned error can't find the container with id 9b2ced5ed356ebb950886aa136597aa3750399ca26bd0ebb8cbeb3bca21e0e9f
Apr 16 18:23:44.039894 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.039876 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:23:44.495727 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.495704 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:23:44.541407 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.541377 2578 generic.go:358] "Generic (PLEG): container finished" podID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerID="c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6" exitCode=0
Apr 16 18:23:44.541574 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.541419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" event={"ID":"2d1c68c6-a9d0-4258-9c90-7934b9b99621","Type":"ContainerDied","Data":"c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6"}
Apr 16 18:23:44.541574 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.541455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2" event={"ID":"2d1c68c6-a9d0-4258-9c90-7934b9b99621","Type":"ContainerDied","Data":"d97f1065998788875214d8f98d21a32cda78ac0253015dcea7483f7dc30710b1"}
Apr 16 18:23:44.541574 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.541473 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"
Apr 16 18:23:44.541574 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.541475 2578 scope.go:117] "RemoveContainer" containerID="c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6"
Apr 16 18:23:44.542991 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.542966 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" event={"ID":"23f9291b-ab76-4262-b891-efbbad7c2b3a","Type":"ContainerStarted","Data":"e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587"}
Apr 16 18:23:44.543115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.542996 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" event={"ID":"23f9291b-ab76-4262-b891-efbbad7c2b3a","Type":"ContainerStarted","Data":"9b2ced5ed356ebb950886aa136597aa3750399ca26bd0ebb8cbeb3bca21e0e9f"}
Apr 16 18:23:44.549884 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.549859 2578 scope.go:117] "RemoveContainer" containerID="6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a"
Apr 16 18:23:44.556870 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.556841 2578 scope.go:117] "RemoveContainer" containerID="c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6"
Apr 16 18:23:44.557137 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:23:44.557117 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6\": container with ID starting with c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6 not found: ID does not exist" containerID="c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6"
Apr 16 18:23:44.557370 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.557143 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6"} err="failed to get container status \"c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6\": rpc error: code = NotFound desc = could not find container \"c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6\": container with ID starting with c5d7671caba35923e6a348ea5468d4d3356b56d033b1b92722d83454fc9345a6 not found: ID does not exist"
Apr 16 18:23:44.557370 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.557160 2578 scope.go:117] "RemoveContainer" containerID="6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a"
Apr 16 18:23:44.557483 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:23:44.557461 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a\": container with ID starting with 6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a not found: ID does not exist" containerID="6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a"
Apr 16 18:23:44.557483 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.557476 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a"} err="failed to get container status \"6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a\": rpc error: code = NotFound desc = could not find container \"6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a\": container with ID starting with 6a6a0a7b0c28e1e5c6c81514f80752d9ac643401d6d682540c64d66114325c3a not found: ID does not exist"
Apr 16 18:23:44.596889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.596825 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d1c68c6-a9d0-4258-9c90-7934b9b99621-kserve-provision-location\") pod \"2d1c68c6-a9d0-4258-9c90-7934b9b99621\" (UID: \"2d1c68c6-a9d0-4258-9c90-7934b9b99621\") "
Apr 16 18:23:44.597116 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.597094 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d1c68c6-a9d0-4258-9c90-7934b9b99621-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2d1c68c6-a9d0-4258-9c90-7934b9b99621" (UID: "2d1c68c6-a9d0-4258-9c90-7934b9b99621"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:23:44.698027 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.697981 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d1c68c6-a9d0-4258-9c90-7934b9b99621-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:23:44.862073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.862041 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"]
Apr 16 18:23:44.864178 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:44.864155 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-lgxm2"]
Apr 16 18:23:45.647397 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:45.647368 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" path="/var/lib/kubelet/pods/2d1c68c6-a9d0-4258-9c90-7934b9b99621/volumes"
Apr 16 18:23:48.557352 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:48.557314 2578 generic.go:358] "Generic (PLEG): container finished" podID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerID="e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587" exitCode=0
Apr 16 18:23:48.557739 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:48.557390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" event={"ID":"23f9291b-ab76-4262-b891-efbbad7c2b3a","Type":"ContainerDied","Data":"e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587"}
Apr 16 18:23:49.562515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:49.562472 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" event={"ID":"23f9291b-ab76-4262-b891-efbbad7c2b3a","Type":"ContainerStarted","Data":"861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b"}
Apr 16 18:23:49.563004 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:49.562814 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:23:49.564054 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:49.564026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 18:23:50.566409 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:23:50.566368 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.31:8080: connect: connection refused"
Apr 16 18:24:00.568113 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:00.568077 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:24:00.585016 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:00.584968 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" podStartSLOduration=17.584954917 podStartE2EDuration="17.584954917s" podCreationTimestamp="2026-04-16 18:23:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:23:49.593059012 +0000 UTC m=+1224.536672076" watchObservedRunningTime="2026-04-16 18:24:00.584954917 +0000 UTC m=+1235.528567973"
Apr 16 18:24:03.591610 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.591578 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"]
Apr 16 18:24:03.591991 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.591801 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="kserve-container" containerID="cri-o://861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b" gracePeriod=30
Apr 16 18:24:03.627583 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.627556 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"]
Apr 16 18:24:03.627901 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.627888 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerName="storage-initializer"
Apr 16 18:24:03.628006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.627903 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerName="storage-initializer"
Apr 16 18:24:03.628006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.627940 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerName="kserve-container"
Apr 16 18:24:03.628006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.627946 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerName="kserve-container"
Apr 16 18:24:03.628006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.627998 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d1c68c6-a9d0-4258-9c90-7934b9b99621" containerName="kserve-container"
Apr 16 18:24:03.631394 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.631378 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"
Apr 16 18:24:03.643034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.643009 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"]
Apr 16 18:24:03.752144 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.752104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4e9f87b-88d0-4acb-b122-939dec459ea4-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs\" (UID: \"e4e9f87b-88d0-4acb-b122-939dec459ea4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"
Apr 16 18:24:03.853576 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.853489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4e9f87b-88d0-4acb-b122-939dec459ea4-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs\" (UID: \"e4e9f87b-88d0-4acb-b122-939dec459ea4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"
Apr 16 18:24:03.853922 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.853902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4e9f87b-88d0-4acb-b122-939dec459ea4-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs\" (UID: \"e4e9f87b-88d0-4acb-b122-939dec459ea4\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"
Apr 16 18:24:03.941240 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:03.941208 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"
Apr 16 18:24:04.067130 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.067104 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"]
Apr 16 18:24:04.069934 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:24:04.069901 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e9f87b_88d0_4acb_b122_939dec459ea4.slice/crio-448c5dc94795f609fa1bc3b6e7b23a2987b2933de71bc273ef3017eb988e7f60 WatchSource:0}: Error finding container 448c5dc94795f609fa1bc3b6e7b23a2987b2933de71bc273ef3017eb988e7f60: Status 404 returned error can't find the container with id 448c5dc94795f609fa1bc3b6e7b23a2987b2933de71bc273ef3017eb988e7f60
Apr 16 18:24:04.220342 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.220318 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:24:04.256615 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.256583 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9291b-ab76-4262-b891-efbbad7c2b3a-kserve-provision-location\") pod \"23f9291b-ab76-4262-b891-efbbad7c2b3a\" (UID: \"23f9291b-ab76-4262-b891-efbbad7c2b3a\") "
Apr 16 18:24:04.256888 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.256867 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f9291b-ab76-4262-b891-efbbad7c2b3a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "23f9291b-ab76-4262-b891-efbbad7c2b3a" (UID: "23f9291b-ab76-4262-b891-efbbad7c2b3a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:24:04.357752 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.357673 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9291b-ab76-4262-b891-efbbad7c2b3a-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:24:04.607685 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.607642 2578 generic.go:358] "Generic (PLEG): container finished" podID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerID="861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b" exitCode=0
Apr 16 18:24:04.608137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.607716 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"
Apr 16 18:24:04.608137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.607726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" event={"ID":"23f9291b-ab76-4262-b891-efbbad7c2b3a","Type":"ContainerDied","Data":"861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b"}
Apr 16 18:24:04.608137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.607755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9" event={"ID":"23f9291b-ab76-4262-b891-efbbad7c2b3a","Type":"ContainerDied","Data":"9b2ced5ed356ebb950886aa136597aa3750399ca26bd0ebb8cbeb3bca21e0e9f"}
Apr 16 18:24:04.608137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.607776 2578 scope.go:117] "RemoveContainer" containerID="861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b"
Apr 16 18:24:04.609437 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.609412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" event={"ID":"e4e9f87b-88d0-4acb-b122-939dec459ea4","Type":"ContainerStarted","Data":"c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34"}
Apr 16 18:24:04.609437 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.609446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" event={"ID":"e4e9f87b-88d0-4acb-b122-939dec459ea4","Type":"ContainerStarted","Data":"448c5dc94795f609fa1bc3b6e7b23a2987b2933de71bc273ef3017eb988e7f60"}
Apr 16 18:24:04.616486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.616469 2578 scope.go:117] "RemoveContainer" containerID="e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587"
Apr 16 18:24:04.623876 ip-10-0-139-88
kubenswrapper[2578]: I0416 18:24:04.623760 2578 scope.go:117] "RemoveContainer" containerID="861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b" Apr 16 18:24:04.623999 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:24:04.623981 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b\": container with ID starting with 861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b not found: ID does not exist" containerID="861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b" Apr 16 18:24:04.624039 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.624007 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b"} err="failed to get container status \"861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b\": rpc error: code = NotFound desc = could not find container \"861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b\": container with ID starting with 861102dba4fc1ad286a71ff036288749b2c5d24a71ad26c1ae3aeb6137b36c4b not found: ID does not exist" Apr 16 18:24:04.624039 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.624025 2578 scope.go:117] "RemoveContainer" containerID="e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587" Apr 16 18:24:04.624279 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:24:04.624261 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587\": container with ID starting with e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587 not found: ID does not exist" containerID="e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587" Apr 16 18:24:04.624335 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:24:04.624285 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587"} err="failed to get container status \"e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587\": rpc error: code = NotFound desc = could not find container \"e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587\": container with ID starting with e3be0af6512ac604e2ee450b440bece558de3c0d1b22d7893c4b1c606b0c8587 not found: ID does not exist" Apr 16 18:24:04.639161 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.639131 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"] Apr 16 18:24:04.640765 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:04.640742 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-8jsq9"] Apr 16 18:24:05.652284 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:05.652250 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" path="/var/lib/kubelet/pods/23f9291b-ab76-4262-b891-efbbad7c2b3a/volumes" Apr 16 18:24:08.623630 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:08.623599 2578 generic.go:358] "Generic (PLEG): container finished" podID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerID="c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34" exitCode=0 Apr 16 18:24:08.623998 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:08.623672 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" event={"ID":"e4e9f87b-88d0-4acb-b122-939dec459ea4","Type":"ContainerDied","Data":"c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34"} Apr 16 18:24:09.628535 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:09.628502 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" event={"ID":"e4e9f87b-88d0-4acb-b122-939dec459ea4","Type":"ContainerStarted","Data":"5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a"} Apr 16 18:24:09.628957 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:09.628729 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" Apr 16 18:24:09.645502 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:09.645458 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" podStartSLOduration=6.645443841 podStartE2EDuration="6.645443841s" podCreationTimestamp="2026-04-16 18:24:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:24:09.643703666 +0000 UTC m=+1244.587316722" watchObservedRunningTime="2026-04-16 18:24:09.645443841 +0000 UTC m=+1244.589056897" Apr 16 18:24:40.636956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:40.636927 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" Apr 16 18:24:43.737862 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.737830 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"] Apr 16 18:24:43.738351 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.738134 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerName="kserve-container" containerID="cri-o://5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a" gracePeriod=30 Apr 16 18:24:43.803437 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:24:43.803400 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg"] Apr 16 18:24:43.803825 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.803807 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="kserve-container" Apr 16 18:24:43.803912 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.803827 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="kserve-container" Apr 16 18:24:43.803912 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.803845 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="storage-initializer" Apr 16 18:24:43.803912 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.803854 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="storage-initializer" Apr 16 18:24:43.804095 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.803950 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="23f9291b-ab76-4262-b891-efbbad7c2b3a" containerName="kserve-container" Apr 16 18:24:43.808284 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.808262 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:24:43.818738 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.818713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg"] Apr 16 18:24:43.898246 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.898172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg\" (UID: \"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:24:43.998753 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.998652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg\" (UID: \"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:24:43.999089 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:43.999064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg\" (UID: \"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:24:44.120938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:44.120899 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:24:44.244820 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:44.244794 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg"] Apr 16 18:24:44.247632 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:24:44.247606 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bc3b69_1a8f_4728_9d5c_4d7a2dd4321e.slice/crio-9c048a62f138c5852fa2705a2db01459eaabb5e7cc946c75635b76b9326699b1 WatchSource:0}: Error finding container 9c048a62f138c5852fa2705a2db01459eaabb5e7cc946c75635b76b9326699b1: Status 404 returned error can't find the container with id 9c048a62f138c5852fa2705a2db01459eaabb5e7cc946c75635b76b9326699b1 Apr 16 18:24:44.725820 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:44.725788 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerStarted","Data":"484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77"} Apr 16 18:24:44.725820 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:44.725823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerStarted","Data":"9c048a62f138c5852fa2705a2db01459eaabb5e7cc946c75635b76b9326699b1"} Apr 16 18:24:44.960843 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:44.960813 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" Apr 16 18:24:45.007320 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.007247 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4e9f87b-88d0-4acb-b122-939dec459ea4-kserve-provision-location\") pod \"e4e9f87b-88d0-4acb-b122-939dec459ea4\" (UID: \"e4e9f87b-88d0-4acb-b122-939dec459ea4\") " Apr 16 18:24:45.007609 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.007584 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e9f87b-88d0-4acb-b122-939dec459ea4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e4e9f87b-88d0-4acb-b122-939dec459ea4" (UID: "e4e9f87b-88d0-4acb-b122-939dec459ea4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:24:45.108603 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.108567 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4e9f87b-88d0-4acb-b122-939dec459ea4-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:24:45.729768 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.729736 2578 generic.go:358] "Generic (PLEG): container finished" podID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerID="5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a" exitCode=0 Apr 16 18:24:45.729949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.729823 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" Apr 16 18:24:45.729949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.729823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" event={"ID":"e4e9f87b-88d0-4acb-b122-939dec459ea4","Type":"ContainerDied","Data":"5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a"} Apr 16 18:24:45.729949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.729927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs" event={"ID":"e4e9f87b-88d0-4acb-b122-939dec459ea4","Type":"ContainerDied","Data":"448c5dc94795f609fa1bc3b6e7b23a2987b2933de71bc273ef3017eb988e7f60"} Apr 16 18:24:45.729949 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.729944 2578 scope.go:117] "RemoveContainer" containerID="5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a" Apr 16 18:24:45.737731 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.737677 2578 scope.go:117] "RemoveContainer" containerID="c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34" Apr 16 18:24:45.744676 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.744656 2578 scope.go:117] "RemoveContainer" containerID="5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a" Apr 16 18:24:45.744741 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.744666 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"] Apr 16 18:24:45.744921 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:24:45.744904 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a\": container with ID starting with 5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a 
not found: ID does not exist" containerID="5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a" Apr 16 18:24:45.744970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.744928 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a"} err="failed to get container status \"5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a\": rpc error: code = NotFound desc = could not find container \"5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a\": container with ID starting with 5e3c608e63d9c5a23ab6e2a9206cc33e6feada6c2b7107cbe7c5899c58e17b5a not found: ID does not exist" Apr 16 18:24:45.744970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.744947 2578 scope.go:117] "RemoveContainer" containerID="c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34" Apr 16 18:24:45.745149 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:24:45.745133 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34\": container with ID starting with c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34 not found: ID does not exist" containerID="c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34" Apr 16 18:24:45.745211 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.745153 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34"} err="failed to get container status \"c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34\": rpc error: code = NotFound desc = could not find container \"c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34\": container with ID starting with c29a1cc3ea6aa516d04bbed19dab704582b8536795444c28c3de3e6151cf4f34 not found: ID does 
not exist" Apr 16 18:24:45.747884 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:45.747866 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-xk4hs"] Apr 16 18:24:47.648339 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:47.648309 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" path="/var/lib/kubelet/pods/e4e9f87b-88d0-4acb-b122-939dec459ea4/volumes" Apr 16 18:24:48.740334 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:48.740304 2578 generic.go:358] "Generic (PLEG): container finished" podID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerID="484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77" exitCode=0 Apr 16 18:24:48.740708 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:48.740358 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerDied","Data":"484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77"} Apr 16 18:24:49.745594 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:49.745558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerStarted","Data":"ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006"} Apr 16 18:24:51.754881 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:51.754790 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerStarted","Data":"a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f"} Apr 16 18:24:51.755346 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:51.754973 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:24:51.775576 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:51.775485 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" podStartSLOduration=6.77262563 podStartE2EDuration="8.775468825s" podCreationTimestamp="2026-04-16 18:24:43 +0000 UTC" firstStartedPulling="2026-04-16 18:24:48.807254822 +0000 UTC m=+1283.750867855" lastFinishedPulling="2026-04-16 18:24:50.810098013 +0000 UTC m=+1285.753711050" observedRunningTime="2026-04-16 18:24:51.773813541 +0000 UTC m=+1286.717426597" watchObservedRunningTime="2026-04-16 18:24:51.775468825 +0000 UTC m=+1286.719081874" Apr 16 18:24:52.759096 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:24:52.759061 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:25:23.765389 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:25:23.765359 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:25:53.767067 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:25:53.767037 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:26:03.892813 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.892782 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg"] Apr 16 18:26:03.893263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.893083 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-container" 
containerID="cri-o://ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006" gracePeriod=30 Apr 16 18:26:03.893263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.893102 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-agent" containerID="cri-o://a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f" gracePeriod=30 Apr 16 18:26:03.953466 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.953435 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx"] Apr 16 18:26:03.953853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.953838 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerName="storage-initializer" Apr 16 18:26:03.953944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.953855 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerName="storage-initializer" Apr 16 18:26:03.953944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.953871 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerName="kserve-container" Apr 16 18:26:03.953944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.953885 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerName="kserve-container" Apr 16 18:26:03.954103 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.953987 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e9f87b-88d0-4acb-b122-939dec459ea4" containerName="kserve-container" Apr 16 18:26:03.957338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.957318 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:26:03.968737 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:03.968711 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx"] Apr 16 18:26:04.072545 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.072466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1742cd93-3891-4754-b466-60079e7a7a90-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-9g7gx\" (UID: \"1742cd93-3891-4754-b466-60079e7a7a90\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:26:04.173746 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.173712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1742cd93-3891-4754-b466-60079e7a7a90-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-9g7gx\" (UID: \"1742cd93-3891-4754-b466-60079e7a7a90\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:26:04.174112 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.174092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1742cd93-3891-4754-b466-60079e7a7a90-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-9g7gx\" (UID: \"1742cd93-3891-4754-b466-60079e7a7a90\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:26:04.267781 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.267744 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:26:04.391062 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.390946 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx"] Apr 16 18:26:04.393838 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:26:04.393796 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1742cd93_3891_4754_b466_60079e7a7a90.slice/crio-3f4c043e6b42611c06e49ce3da569359d9448221a69028ab69f642814b83f486 WatchSource:0}: Error finding container 3f4c043e6b42611c06e49ce3da569359d9448221a69028ab69f642814b83f486: Status 404 returned error can't find the container with id 3f4c043e6b42611c06e49ce3da569359d9448221a69028ab69f642814b83f486 Apr 16 18:26:04.977135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.977100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" event={"ID":"1742cd93-3891-4754-b466-60079e7a7a90","Type":"ContainerStarted","Data":"f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346"} Apr 16 18:26:04.977135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:04.977134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" event={"ID":"1742cd93-3891-4754-b466-60079e7a7a90","Type":"ContainerStarted","Data":"3f4c043e6b42611c06e49ce3da569359d9448221a69028ab69f642814b83f486"} Apr 16 18:26:06.984269 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:06.984234 2578 generic.go:358] "Generic (PLEG): container finished" podID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerID="ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006" exitCode=0 Apr 16 18:26:06.984652 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:06.984309 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerDied","Data":"ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006"} Apr 16 18:26:08.992257 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:08.992223 2578 generic.go:358] "Generic (PLEG): container finished" podID="1742cd93-3891-4754-b466-60079e7a7a90" containerID="f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346" exitCode=0 Apr 16 18:26:08.992593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:08.992300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" event={"ID":"1742cd93-3891-4754-b466-60079e7a7a90","Type":"ContainerDied","Data":"f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346"} Apr 16 18:26:13.762563 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:13.762522 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.33:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:26:20.035600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:20.035565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" event={"ID":"1742cd93-3891-4754-b466-60079e7a7a90","Type":"ContainerStarted","Data":"69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b"} Apr 16 18:26:20.036004 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:20.035950 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:26:20.037085 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:20.037058 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:20.060499 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:20.060452 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podStartSLOduration=6.204068407 podStartE2EDuration="17.060419477s" podCreationTimestamp="2026-04-16 18:26:03 +0000 UTC" firstStartedPulling="2026-04-16 18:26:08.993402619 +0000 UTC m=+1363.937015656" lastFinishedPulling="2026-04-16 18:26:19.849753674 +0000 UTC m=+1374.793366726" observedRunningTime="2026-04-16 18:26:20.058550352 +0000 UTC m=+1375.002163407" watchObservedRunningTime="2026-04-16 18:26:20.060419477 +0000 UTC m=+1375.004032535" Apr 16 18:26:21.038667 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:21.038629 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:23.762950 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:23.762910 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.33:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:26:31.038837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:31.038790 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:33.763135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:33.763094 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.33:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.33:8080: connect: connection refused" Apr 16 18:26:33.763509 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:33.763248 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:26:34.079620 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.079594 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:26:34.085450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.085426 2578 generic.go:358] "Generic (PLEG): container finished" podID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerID="a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f" exitCode=0 Apr 16 18:26:34.085577 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.085500 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" Apr 16 18:26:34.085645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.085501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerDied","Data":"a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f"} Apr 16 18:26:34.085645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.085607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg" event={"ID":"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e","Type":"ContainerDied","Data":"9c048a62f138c5852fa2705a2db01459eaabb5e7cc946c75635b76b9326699b1"} Apr 16 18:26:34.085645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.085624 2578 scope.go:117] "RemoveContainer" containerID="a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f" Apr 16 18:26:34.092901 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.092881 2578 scope.go:117] "RemoveContainer" containerID="ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006" Apr 16 18:26:34.099754 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.099739 2578 scope.go:117] "RemoveContainer" containerID="484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77" Apr 16 18:26:34.108363 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.108341 2578 scope.go:117] "RemoveContainer" containerID="a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f" Apr 16 18:26:34.108652 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:26:34.108631 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f\": container with ID starting with a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f not found: ID does not exist" 
containerID="a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f" Apr 16 18:26:34.108726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.108661 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f"} err="failed to get container status \"a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f\": rpc error: code = NotFound desc = could not find container \"a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f\": container with ID starting with a48481a1a17f47d5736e3e9e8f8079d18dd37902560abe4614302b99163d326f not found: ID does not exist" Apr 16 18:26:34.108726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.108681 2578 scope.go:117] "RemoveContainer" containerID="ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006" Apr 16 18:26:34.108899 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:26:34.108880 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006\": container with ID starting with ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006 not found: ID does not exist" containerID="ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006" Apr 16 18:26:34.108955 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.108903 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006"} err="failed to get container status \"ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006\": rpc error: code = NotFound desc = could not find container \"ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006\": container with ID starting with ac1f934d7a44aa456c0305d7b622a209d662c33c82a28564db2a99c5ed4c4006 not found: ID does not exist" Apr 16 
18:26:34.108955 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.108917 2578 scope.go:117] "RemoveContainer" containerID="484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77" Apr 16 18:26:34.109128 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:26:34.109105 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77\": container with ID starting with 484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77 not found: ID does not exist" containerID="484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77" Apr 16 18:26:34.109184 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.109139 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77"} err="failed to get container status \"484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77\": rpc error: code = NotFound desc = could not find container \"484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77\": container with ID starting with 484a103cf2278484e8810aa447cecd2519ebf5779e40b7eaa120c8af36e4dc77 not found: ID does not exist" Apr 16 18:26:34.131898 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.131871 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e-kserve-provision-location\") pod \"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e\" (UID: \"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e\") " Apr 16 18:26:34.132244 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.132223 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" (UID: "b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:26:34.232993 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.232964 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:26:34.406625 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.406594 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg"] Apr 16 18:26:34.410141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:34.410117 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6dfb5d8c99-pklrg"] Apr 16 18:26:35.647533 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:35.647500 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" path="/var/lib/kubelet/pods/b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e/volumes" Apr 16 18:26:41.039179 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:41.039135 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:26:51.039351 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:26:51.039303 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:27:01.039081 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:27:01.039031 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused" Apr 16 18:27:11.040439 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:11.040361 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:27:15.479046 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.479014 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx"] Apr 16 18:27:15.479430 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.479263 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container" containerID="cri-o://69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b" gracePeriod=30 Apr 16 18:27:15.588008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.587976 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"] Apr 16 18:27:15.588333 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588320 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="storage-initializer" Apr 16 18:27:15.588386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588334 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="storage-initializer" Apr 16 18:27:15.588386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588347 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" 
containerName="kserve-container" Apr 16 18:27:15.588386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588353 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-container" Apr 16 18:27:15.588386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588365 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-agent" Apr 16 18:27:15.588386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588370 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-agent" Apr 16 18:27:15.588537 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588421 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-container" Apr 16 18:27:15.588537 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.588432 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9bc3b69-1a8f-4728-9d5c-4d7a2dd4321e" containerName="kserve-agent" Apr 16 18:27:15.591379 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.591360 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:27:15.599501 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.599480 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"] Apr 16 18:27:15.683317 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.683286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e1d74d-a73e-46ee-9373-606f4c3a579c-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zj8d2\" (UID: \"45e1d74d-a73e-46ee-9373-606f4c3a579c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:27:15.784561 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.784474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e1d74d-a73e-46ee-9373-606f4c3a579c-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zj8d2\" (UID: \"45e1d74d-a73e-46ee-9373-606f4c3a579c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:27:15.784850 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.784833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e1d74d-a73e-46ee-9373-606f4c3a579c-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zj8d2\" (UID: \"45e1d74d-a73e-46ee-9373-606f4c3a579c\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:27:15.902584 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:15.902542 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:27:16.021404 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:16.021374 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"] Apr 16 18:27:16.025092 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:27:16.025067 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e1d74d_a73e_46ee_9373_606f4c3a579c.slice/crio-91829171c8507081982a1d1cd870ab4b154b924f5a1248e3c4bece95e6d8a956 WatchSource:0}: Error finding container 91829171c8507081982a1d1cd870ab4b154b924f5a1248e3c4bece95e6d8a956: Status 404 returned error can't find the container with id 91829171c8507081982a1d1cd870ab4b154b924f5a1248e3c4bece95e6d8a956 Apr 16 18:27:16.225494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:16.225456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" event={"ID":"45e1d74d-a73e-46ee-9373-606f4c3a579c","Type":"ContainerStarted","Data":"a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765"} Apr 16 18:27:16.225494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:16.225494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" event={"ID":"45e1d74d-a73e-46ee-9373-606f4c3a579c","Type":"ContainerStarted","Data":"91829171c8507081982a1d1cd870ab4b154b924f5a1248e3c4bece95e6d8a956"} Apr 16 18:27:18.215292 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.215270 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:27:18.241265 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.241224 2578 generic.go:358] "Generic (PLEG): container finished" podID="1742cd93-3891-4754-b466-60079e7a7a90" containerID="69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b" exitCode=0 Apr 16 18:27:18.241443 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.241262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" event={"ID":"1742cd93-3891-4754-b466-60079e7a7a90","Type":"ContainerDied","Data":"69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b"} Apr 16 18:27:18.241443 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.241309 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" Apr 16 18:27:18.241443 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.241326 2578 scope.go:117] "RemoveContainer" containerID="69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b" Apr 16 18:27:18.241443 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.241314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx" event={"ID":"1742cd93-3891-4754-b466-60079e7a7a90","Type":"ContainerDied","Data":"3f4c043e6b42611c06e49ce3da569359d9448221a69028ab69f642814b83f486"} Apr 16 18:27:18.249442 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.249422 2578 scope.go:117] "RemoveContainer" containerID="f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346" Apr 16 18:27:18.256306 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.256288 2578 scope.go:117] "RemoveContainer" containerID="69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b" Apr 16 18:27:18.256534 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:27:18.256514 2578 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b\": container with ID starting with 69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b not found: ID does not exist" containerID="69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b" Apr 16 18:27:18.256624 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.256541 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b"} err="failed to get container status \"69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b\": rpc error: code = NotFound desc = could not find container \"69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b\": container with ID starting with 69ae0c1ffaa9db82febc740b55fe1dbf704e65d20c5091baae878082ebe9379b not found: ID does not exist" Apr 16 18:27:18.256624 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.256558 2578 scope.go:117] "RemoveContainer" containerID="f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346" Apr 16 18:27:18.256809 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:27:18.256793 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346\": container with ID starting with f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346 not found: ID does not exist" containerID="f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346" Apr 16 18:27:18.256851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.256817 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346"} err="failed to get container status 
\"f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346\": rpc error: code = NotFound desc = could not find container \"f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346\": container with ID starting with f9de76e27a3cfc57d07baa23ee93f47e83cf56be89ed4046cbe7f855eaceb346 not found: ID does not exist" Apr 16 18:27:18.303291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.303255 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1742cd93-3891-4754-b466-60079e7a7a90-kserve-provision-location\") pod \"1742cd93-3891-4754-b466-60079e7a7a90\" (UID: \"1742cd93-3891-4754-b466-60079e7a7a90\") " Apr 16 18:27:18.312366 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.312303 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1742cd93-3891-4754-b466-60079e7a7a90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1742cd93-3891-4754-b466-60079e7a7a90" (UID: "1742cd93-3891-4754-b466-60079e7a7a90"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:27:18.404069 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.404032 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1742cd93-3891-4754-b466-60079e7a7a90-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:27:18.561228 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.561180 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx"] Apr 16 18:27:18.564550 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:18.564492 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-9g7gx"] Apr 16 18:27:19.648686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:19.648650 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1742cd93-3891-4754-b466-60079e7a7a90" path="/var/lib/kubelet/pods/1742cd93-3891-4754-b466-60079e7a7a90/volumes" Apr 16 18:27:21.251971 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:21.251943 2578 generic.go:358] "Generic (PLEG): container finished" podID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerID="a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765" exitCode=0 Apr 16 18:27:21.252386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:21.252015 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" event={"ID":"45e1d74d-a73e-46ee-9373-606f4c3a579c","Type":"ContainerDied","Data":"a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765"} Apr 16 18:27:22.256764 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:22.256723 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" 
event={"ID":"45e1d74d-a73e-46ee-9373-606f4c3a579c","Type":"ContainerStarted","Data":"6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459"} Apr 16 18:27:22.257252 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:22.257066 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:27:22.258446 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:22.258422 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:27:22.275775 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:22.275721 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" podStartSLOduration=7.275708151 podStartE2EDuration="7.275708151s" podCreationTimestamp="2026-04-16 18:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:27:22.274071856 +0000 UTC m=+1437.217684913" watchObservedRunningTime="2026-04-16 18:27:22.275708151 +0000 UTC m=+1437.219321206" Apr 16 18:27:23.260035 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:23.259997 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:27:33.260412 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:33.260373 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" 
podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:27:43.260096 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:43.260052 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:27:53.260999 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:27:53.260956 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused" Apr 16 18:28:03.262015 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:03.261980 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" Apr 16 18:28:07.106493 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.106460 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"] Apr 16 18:28:07.106878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.106753 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container" containerID="cri-o://6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459" gracePeriod=30 Apr 16 18:28:07.177362 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.177327 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"] Apr 16 18:28:07.177677 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.177666 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="storage-initializer"
Apr 16 18:28:07.177722 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.177679 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="storage-initializer"
Apr 16 18:28:07.177722 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.177689 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container"
Apr 16 18:28:07.177722 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.177696 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container"
Apr 16 18:28:07.177815 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.177747 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1742cd93-3891-4754-b466-60079e7a7a90" containerName="kserve-container"
Apr 16 18:28:07.180804 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.180788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:07.190108 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.190085 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"]
Apr 16 18:28:07.317564 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.317528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2\" (UID: \"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:07.418158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.418069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2\" (UID: \"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:07.418473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.418452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2\" (UID: \"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:07.491325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.491292 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:07.609486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:07.609455 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"]
Apr 16 18:28:07.612300 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:28:07.612273 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fcdddae_37fe_4f1b_9c4a_e7c87063e28d.slice/crio-aba09cf9170235e0a9ae7ef57297e077a8bde9366a132b55106aae217031a055 WatchSource:0}: Error finding container aba09cf9170235e0a9ae7ef57297e077a8bde9366a132b55106aae217031a055: Status 404 returned error can't find the container with id aba09cf9170235e0a9ae7ef57297e077a8bde9366a132b55106aae217031a055
Apr 16 18:28:08.401450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:08.400088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" event={"ID":"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d","Type":"ContainerStarted","Data":"7e85b58ac13cfa65ea113c511850a5971768c00043093a07b2ef45eb86e9399d"}
Apr 16 18:28:08.401450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:08.400130 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" event={"ID":"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d","Type":"ContainerStarted","Data":"aba09cf9170235e0a9ae7ef57297e077a8bde9366a132b55106aae217031a055"}
Apr 16 18:28:09.844992 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:09.844963 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"
Apr 16 18:28:10.042467 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.042437 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e1d74d-a73e-46ee-9373-606f4c3a579c-kserve-provision-location\") pod \"45e1d74d-a73e-46ee-9373-606f4c3a579c\" (UID: \"45e1d74d-a73e-46ee-9373-606f4c3a579c\") "
Apr 16 18:28:10.052263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.052229 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e1d74d-a73e-46ee-9373-606f4c3a579c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45e1d74d-a73e-46ee-9373-606f4c3a579c" (UID: "45e1d74d-a73e-46ee-9373-606f4c3a579c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:28:10.143861 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.143822 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45e1d74d-a73e-46ee-9373-606f4c3a579c-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:28:10.406869 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.406784 2578 generic.go:358] "Generic (PLEG): container finished" podID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerID="6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459" exitCode=0
Apr 16 18:28:10.406869 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.406852 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"
Apr 16 18:28:10.407061 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.406865 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" event={"ID":"45e1d74d-a73e-46ee-9373-606f4c3a579c","Type":"ContainerDied","Data":"6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459"}
Apr 16 18:28:10.407061 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.406899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2" event={"ID":"45e1d74d-a73e-46ee-9373-606f4c3a579c","Type":"ContainerDied","Data":"91829171c8507081982a1d1cd870ab4b154b924f5a1248e3c4bece95e6d8a956"}
Apr 16 18:28:10.407061 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.406915 2578 scope.go:117] "RemoveContainer" containerID="6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459"
Apr 16 18:28:10.415145 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.415127 2578 scope.go:117] "RemoveContainer" containerID="a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765"
Apr 16 18:28:10.422019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.422001 2578 scope.go:117] "RemoveContainer" containerID="6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459"
Apr 16 18:28:10.422257 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:28:10.422241 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459\": container with ID starting with 6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459 not found: ID does not exist" containerID="6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459"
Apr 16 18:28:10.422308 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.422263 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459"} err="failed to get container status \"6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459\": rpc error: code = NotFound desc = could not find container \"6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459\": container with ID starting with 6943b67275def922030d40420d06159c9467200d5f9ce2459402691ebdc93459 not found: ID does not exist"
Apr 16 18:28:10.422308 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.422278 2578 scope.go:117] "RemoveContainer" containerID="a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765"
Apr 16 18:28:10.422468 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:28:10.422454 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765\": container with ID starting with a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765 not found: ID does not exist" containerID="a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765"
Apr 16 18:28:10.422508 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.422469 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765"} err="failed to get container status \"a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765\": rpc error: code = NotFound desc = could not find container \"a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765\": container with ID starting with a86f6dcc56e92f35b0cf3069d85e9308ecadf0b8fe64342dbfb9cde4dd129765 not found: ID does not exist"
Apr 16 18:28:10.426868 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.426847 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"]
Apr 16 18:28:10.432037 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:10.432018 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zj8d2"]
Apr 16 18:28:11.648706 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:11.648671 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" path="/var/lib/kubelet/pods/45e1d74d-a73e-46ee-9373-606f4c3a579c/volumes"
Apr 16 18:28:12.415807 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:12.415776 2578 generic.go:358] "Generic (PLEG): container finished" podID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerID="7e85b58ac13cfa65ea113c511850a5971768c00043093a07b2ef45eb86e9399d" exitCode=0
Apr 16 18:28:12.415978 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:12.415820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" event={"ID":"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d","Type":"ContainerDied","Data":"7e85b58ac13cfa65ea113c511850a5971768c00043093a07b2ef45eb86e9399d"}
Apr 16 18:28:13.420578 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:13.420548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" event={"ID":"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d","Type":"ContainerStarted","Data":"af4c0a9762a060679a632e15c294b01f335c8d69fd22d2089ddad53f1c015d56"}
Apr 16 18:28:13.421015 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:13.420851 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:13.422055 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:13.422028 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 18:28:13.437230 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:13.437158 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podStartSLOduration=6.437141003 podStartE2EDuration="6.437141003s" podCreationTimestamp="2026-04-16 18:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:28:13.436621007 +0000 UTC m=+1488.380234063" watchObservedRunningTime="2026-04-16 18:28:13.437141003 +0000 UTC m=+1488.380754059"
Apr 16 18:28:14.424041 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:14.424008 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 18:28:24.424217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:24.424157 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 18:28:34.425015 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:34.424972 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 18:28:44.424757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:44.424710 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 16 18:28:54.425272 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:54.425240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:28:58.894922 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:58.894887 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"]
Apr 16 18:28:58.895340 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:58.895225 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container" containerID="cri-o://af4c0a9762a060679a632e15c294b01f335c8d69fd22d2089ddad53f1c015d56" gracePeriod=30
Apr 16 18:28:59.001677 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.001639 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"]
Apr 16 18:28:59.002004 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.001990 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="storage-initializer"
Apr 16 18:28:59.002057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.002005 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="storage-initializer"
Apr 16 18:28:59.002057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.002014 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container"
Apr 16 18:28:59.002057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.002020 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container"
Apr 16 18:28:59.002156 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.002098 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="45e1d74d-a73e-46ee-9373-606f4c3a579c" containerName="kserve-container"
Apr 16 18:28:59.005110 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.005092 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:28:59.013146 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.013123 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"]
Apr 16 18:28:59.060794 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.060765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3838e5fa-bd5f-4619-bcaa-729785cf2e5d-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-52cpc\" (UID: \"3838e5fa-bd5f-4619-bcaa-729785cf2e5d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:28:59.161601 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.161511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3838e5fa-bd5f-4619-bcaa-729785cf2e5d-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-52cpc\" (UID: \"3838e5fa-bd5f-4619-bcaa-729785cf2e5d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:28:59.161849 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.161833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3838e5fa-bd5f-4619-bcaa-729785cf2e5d-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-52cpc\" (UID: \"3838e5fa-bd5f-4619-bcaa-729785cf2e5d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:28:59.316353 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.316309 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:28:59.437056 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.437026 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"]
Apr 16 18:28:59.439450 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:28:59.439423 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3838e5fa_bd5f_4619_bcaa_729785cf2e5d.slice/crio-46dc46787bb42b7b55cabdcce9d6a3a037514dcda7445c7909d63c169883de7a WatchSource:0}: Error finding container 46dc46787bb42b7b55cabdcce9d6a3a037514dcda7445c7909d63c169883de7a: Status 404 returned error can't find the container with id 46dc46787bb42b7b55cabdcce9d6a3a037514dcda7445c7909d63c169883de7a
Apr 16 18:28:59.441267 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.441253 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:28:59.568379 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.568331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" event={"ID":"3838e5fa-bd5f-4619-bcaa-729785cf2e5d","Type":"ContainerStarted","Data":"83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb"}
Apr 16 18:28:59.568535 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:28:59.568391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" event={"ID":"3838e5fa-bd5f-4619-bcaa-729785cf2e5d","Type":"ContainerStarted","Data":"46dc46787bb42b7b55cabdcce9d6a3a037514dcda7445c7909d63c169883de7a"}
Apr 16 18:29:01.578373 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:01.578270 2578 generic.go:358] "Generic (PLEG): container finished" podID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerID="af4c0a9762a060679a632e15c294b01f335c8d69fd22d2089ddad53f1c015d56" exitCode=0
Apr 16 18:29:01.578373 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:01.578331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" event={"ID":"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d","Type":"ContainerDied","Data":"af4c0a9762a060679a632e15c294b01f335c8d69fd22d2089ddad53f1c015d56"}
Apr 16 18:29:01.639434 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:01.639409 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:29:01.686420 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:01.686390 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d-kserve-provision-location\") pod \"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d\" (UID: \"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d\") "
Apr 16 18:29:01.695620 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:01.695588 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" (UID: "1fcdddae-37fe-4f1b-9c4a-e7c87063e28d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:29:01.787685 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:01.787594 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:29:02.582421 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:02.582391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2" event={"ID":"1fcdddae-37fe-4f1b-9c4a-e7c87063e28d","Type":"ContainerDied","Data":"aba09cf9170235e0a9ae7ef57297e077a8bde9366a132b55106aae217031a055"}
Apr 16 18:29:02.582421 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:02.582431 2578 scope.go:117] "RemoveContainer" containerID="af4c0a9762a060679a632e15c294b01f335c8d69fd22d2089ddad53f1c015d56"
Apr 16 18:29:02.582900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:02.582440 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"
Apr 16 18:29:02.590477 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:02.590456 2578 scope.go:117] "RemoveContainer" containerID="7e85b58ac13cfa65ea113c511850a5971768c00043093a07b2ef45eb86e9399d"
Apr 16 18:29:02.602561 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:02.602538 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"]
Apr 16 18:29:02.606006 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:02.605987 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-ntwl2"]
Apr 16 18:29:03.587837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:03.587751 2578 generic.go:358] "Generic (PLEG): container finished" podID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerID="83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb" exitCode=0
Apr 16 18:29:03.587837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:03.587824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" event={"ID":"3838e5fa-bd5f-4619-bcaa-729785cf2e5d","Type":"ContainerDied","Data":"83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb"}
Apr 16 18:29:03.648630 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:03.648600 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" path="/var/lib/kubelet/pods/1fcdddae-37fe-4f1b-9c4a-e7c87063e28d/volumes"
Apr 16 18:29:10.617092 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:10.617056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" event={"ID":"3838e5fa-bd5f-4619-bcaa-729785cf2e5d","Type":"ContainerStarted","Data":"bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118"}
Apr 16 18:29:10.617553 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:10.617390 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:29:10.618647 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:10.618620 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:29:10.633148 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:10.633104 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podStartSLOduration=5.777776284 podStartE2EDuration="12.633090361s" podCreationTimestamp="2026-04-16 18:28:58 +0000 UTC" firstStartedPulling="2026-04-16 18:29:03.58894147 +0000 UTC m=+1538.532554503" lastFinishedPulling="2026-04-16 18:29:10.444255544 +0000 UTC m=+1545.387868580" observedRunningTime="2026-04-16 18:29:10.631521303 +0000 UTC m=+1545.575134371" watchObservedRunningTime="2026-04-16 18:29:10.633090361 +0000 UTC m=+1545.576703417"
Apr 16 18:29:11.621389 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:11.621352 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:29:21.621987 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:21.621941 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:29:31.621932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:31.621890 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:29:41.622203 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:41.622155 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:29:51.622131 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:29:51.622082 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:30:01.621489 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:01.621446 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:30:11.621921 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:11.621833 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:30:12.644366 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:12.644322 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:30:22.644792 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:22.644750 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:30:32.645898 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:32.645858 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:30:39.907640 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:39.907604 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"]
Apr 16 18:30:39.908121 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:39.907884 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" containerID="cri-o://bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118" gracePeriod=30
Apr 16 18:30:40.009995 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.009958 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"]
Apr 16 18:30:40.010389 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.010371 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="storage-initializer"
Apr 16 18:30:40.010490 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.010391 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="storage-initializer"
Apr 16 18:30:40.010490 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.010403 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container"
Apr 16 18:30:40.010490 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.010411 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container"
Apr 16 18:30:40.010650 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.010509 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fcdddae-37fe-4f1b-9c4a-e7c87063e28d" containerName="kserve-container"
Apr 16 18:30:40.013021 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.013000 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"
Apr 16 18:30:40.022649 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.022625 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"]
Apr 16 18:30:40.126435 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.126399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ce985f-43a6-4ac2-94e9-f180b18b0b45-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-jtmwz\" (UID: \"99ce985f-43a6-4ac2-94e9-f180b18b0b45\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"
Apr 16 18:30:40.227716 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.227683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ce985f-43a6-4ac2-94e9-f180b18b0b45-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-jtmwz\" (UID: \"99ce985f-43a6-4ac2-94e9-f180b18b0b45\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"
Apr 16 18:30:40.228069 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.228050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ce985f-43a6-4ac2-94e9-f180b18b0b45-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-jtmwz\" (UID: \"99ce985f-43a6-4ac2-94e9-f180b18b0b45\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"
Apr 16 18:30:40.323842 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.323804 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"
Apr 16 18:30:40.442900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.442873 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"]
Apr 16 18:30:40.445663 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:30:40.445633 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ce985f_43a6_4ac2_94e9_f180b18b0b45.slice/crio-a99ad6b66ccd9f17038d8919ff096a0616f8ec25c7d98d5dec0b03ba124852cb WatchSource:0}: Error finding container a99ad6b66ccd9f17038d8919ff096a0616f8ec25c7d98d5dec0b03ba124852cb: Status 404 returned error can't find the container with id a99ad6b66ccd9f17038d8919ff096a0616f8ec25c7d98d5dec0b03ba124852cb
Apr 16 18:30:40.894009 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.893927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" event={"ID":"99ce985f-43a6-4ac2-94e9-f180b18b0b45","Type":"ContainerStarted","Data":"5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9"}
Apr 16 18:30:40.894009 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:40.893961 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" event={"ID":"99ce985f-43a6-4ac2-94e9-f180b18b0b45","Type":"ContainerStarted","Data":"a99ad6b66ccd9f17038d8919ff096a0616f8ec25c7d98d5dec0b03ba124852cb"}
Apr 16 18:30:42.644722 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:42.644681 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused"
Apr 16 18:30:43.454139 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.454115 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"
Apr 16 18:30:43.556570 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.556482 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3838e5fa-bd5f-4619-bcaa-729785cf2e5d-kserve-provision-location\") pod \"3838e5fa-bd5f-4619-bcaa-729785cf2e5d\" (UID: \"3838e5fa-bd5f-4619-bcaa-729785cf2e5d\") "
Apr 16 18:30:43.556824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.556802 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3838e5fa-bd5f-4619-bcaa-729785cf2e5d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3838e5fa-bd5f-4619-bcaa-729785cf2e5d" (UID: "3838e5fa-bd5f-4619-bcaa-729785cf2e5d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:30:43.657681 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.657654 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3838e5fa-bd5f-4619-bcaa-729785cf2e5d-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:30:43.905402 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.905313 2578 generic.go:358] "Generic (PLEG): container finished" podID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerID="bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118" exitCode=0
Apr 16 18:30:43.905402 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.905364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" event={"ID":"3838e5fa-bd5f-4619-bcaa-729785cf2e5d","Type":"ContainerDied","Data":"bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118"}
Apr 16 18:30:43.905402 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.905404 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" event={"ID":"3838e5fa-bd5f-4619-bcaa-729785cf2e5d","Type":"ContainerDied","Data":"46dc46787bb42b7b55cabdcce9d6a3a037514dcda7445c7909d63c169883de7a"}
Apr 16 18:30:43.905682 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.905419 2578 scope.go:117] "RemoveContainer" containerID="bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118"
Apr 16 18:30:43.905682 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.905377 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc" Apr 16 18:30:43.913468 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.913351 2578 scope.go:117] "RemoveContainer" containerID="83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb" Apr 16 18:30:43.922666 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.922644 2578 scope.go:117] "RemoveContainer" containerID="bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118" Apr 16 18:30:43.923177 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:30:43.923151 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118\": container with ID starting with bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118 not found: ID does not exist" containerID="bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118" Apr 16 18:30:43.923326 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.923241 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118"} err="failed to get container status \"bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118\": rpc error: code = NotFound desc = could not find container \"bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118\": container with ID starting with bb70adaa7e9d3e9b3d16a8b1290deb7f29d13bf517ead47d36c42aea37b21118 not found: ID does not exist" Apr 16 18:30:43.923326 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.923270 2578 scope.go:117] "RemoveContainer" containerID="83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb" Apr 16 18:30:43.923635 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:30:43.923610 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb\": container with ID starting with 83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb not found: ID does not exist" containerID="83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb" Apr 16 18:30:43.923712 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.923657 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb"} err="failed to get container status \"83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb\": rpc error: code = NotFound desc = could not find container \"83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb\": container with ID starting with 83c658e73d9c4495b84f406621947be38fb7508185779b15369a51305e9494eb not found: ID does not exist" Apr 16 18:30:43.926299 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.926274 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"] Apr 16 18:30:43.944303 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:43.944269 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-52cpc"] Apr 16 18:30:44.909633 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:44.909597 2578 generic.go:358] "Generic (PLEG): container finished" podID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerID="5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9" exitCode=0 Apr 16 18:30:44.910075 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:44.909670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" event={"ID":"99ce985f-43a6-4ac2-94e9-f180b18b0b45","Type":"ContainerDied","Data":"5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9"} Apr 16 18:30:45.647834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:45.647800 2578 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" path="/var/lib/kubelet/pods/3838e5fa-bd5f-4619-bcaa-729785cf2e5d/volumes" Apr 16 18:30:45.914755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:45.914667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" event={"ID":"99ce985f-43a6-4ac2-94e9-f180b18b0b45","Type":"ContainerStarted","Data":"afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb"} Apr 16 18:30:45.915107 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:45.914985 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" Apr 16 18:30:45.916387 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:45.916361 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:30:45.933024 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:45.932968 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podStartSLOduration=6.932949449 podStartE2EDuration="6.932949449s" podCreationTimestamp="2026-04-16 18:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:45.931132052 +0000 UTC m=+1640.874745109" watchObservedRunningTime="2026-04-16 18:30:45.932949449 +0000 UTC m=+1640.876562507" Apr 16 18:30:46.918596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:46.918549 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" 
podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:30:56.918565 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:30:56.918522 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:31:06.918826 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:06.918778 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:31:16.919287 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:16.919241 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:31:26.919105 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:26.919056 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:31:36.919512 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:36.919411 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.38:8080: connect: connection refused" Apr 16 18:31:46.918599 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:46.918552 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:31:48.644251 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:48.644209 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:31:58.645168 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:31:58.645121 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 16 18:32:08.645381 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:08.645348 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" Apr 16 18:32:10.995274 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:10.995241 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"] Apr 16 18:32:10.995643 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:10.995476 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" containerID="cri-o://afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb" 
gracePeriod=30 Apr 16 18:32:11.092410 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.092375 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2"] Apr 16 18:32:11.092770 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.092756 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" Apr 16 18:32:11.092821 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.092772 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" Apr 16 18:32:11.092821 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.092788 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="storage-initializer" Apr 16 18:32:11.092821 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.092794 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="storage-initializer" Apr 16 18:32:11.092924 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.092893 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3838e5fa-bd5f-4619-bcaa-729785cf2e5d" containerName="kserve-container" Apr 16 18:32:11.095920 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.095905 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:32:11.105307 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.105283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2"] Apr 16 18:32:11.199146 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.199113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49886d4a-d8d7-4d77-9dec-4227cc327c56-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2\" (UID: \"49886d4a-d8d7-4d77-9dec-4227cc327c56\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:32:11.299965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.299880 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49886d4a-d8d7-4d77-9dec-4227cc327c56-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2\" (UID: \"49886d4a-d8d7-4d77-9dec-4227cc327c56\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:32:11.300304 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.300284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49886d4a-d8d7-4d77-9dec-4227cc327c56-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2\" (UID: \"49886d4a-d8d7-4d77-9dec-4227cc327c56\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:32:11.405607 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.405575 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:32:11.525424 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:11.525386 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2"] Apr 16 18:32:11.529707 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:32:11.529676 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49886d4a_d8d7_4d77_9dec_4227cc327c56.slice/crio-03c94df4d244d33425b89a744416444e0b8b8bcd95f68e9a0c3eb16eeffc44c4 WatchSource:0}: Error finding container 03c94df4d244d33425b89a744416444e0b8b8bcd95f68e9a0c3eb16eeffc44c4: Status 404 returned error can't find the container with id 03c94df4d244d33425b89a744416444e0b8b8bcd95f68e9a0c3eb16eeffc44c4 Apr 16 18:32:12.175044 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:12.175004 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" event={"ID":"49886d4a-d8d7-4d77-9dec-4227cc327c56","Type":"ContainerStarted","Data":"be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337"} Apr 16 18:32:12.175044 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:12.175037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" event={"ID":"49886d4a-d8d7-4d77-9dec-4227cc327c56","Type":"ContainerStarted","Data":"03c94df4d244d33425b89a744416444e0b8b8bcd95f68e9a0c3eb16eeffc44c4"} Apr 16 18:32:14.632605 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:14.632584 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" Apr 16 18:32:14.728572 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:14.728519 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ce985f-43a6-4ac2-94e9-f180b18b0b45-kserve-provision-location\") pod \"99ce985f-43a6-4ac2-94e9-f180b18b0b45\" (UID: \"99ce985f-43a6-4ac2-94e9-f180b18b0b45\") " Apr 16 18:32:14.728803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:14.728784 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ce985f-43a6-4ac2-94e9-f180b18b0b45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99ce985f-43a6-4ac2-94e9-f180b18b0b45" (UID: "99ce985f-43a6-4ac2-94e9-f180b18b0b45"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:14.728911 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:14.728891 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99ce985f-43a6-4ac2-94e9-f180b18b0b45-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:32:15.183970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.183880 2578 generic.go:358] "Generic (PLEG): container finished" podID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerID="afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb" exitCode=0 Apr 16 18:32:15.183970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.183946 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" Apr 16 18:32:15.183970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.183962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" event={"ID":"99ce985f-43a6-4ac2-94e9-f180b18b0b45","Type":"ContainerDied","Data":"afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb"} Apr 16 18:32:15.184221 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.183999 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz" event={"ID":"99ce985f-43a6-4ac2-94e9-f180b18b0b45","Type":"ContainerDied","Data":"a99ad6b66ccd9f17038d8919ff096a0616f8ec25c7d98d5dec0b03ba124852cb"} Apr 16 18:32:15.184221 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.184015 2578 scope.go:117] "RemoveContainer" containerID="afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb" Apr 16 18:32:15.192084 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.192067 2578 scope.go:117] "RemoveContainer" containerID="5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9" Apr 16 18:32:15.199045 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.199027 2578 scope.go:117] "RemoveContainer" containerID="afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb" Apr 16 18:32:15.199316 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:32:15.199296 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb\": container with ID starting with afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb not found: ID does not exist" containerID="afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb" Apr 16 18:32:15.199372 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.199327 2578 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb"} err="failed to get container status \"afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb\": rpc error: code = NotFound desc = could not find container \"afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb\": container with ID starting with afece373df7045fe6f98539f11dfe77d99ff5b6773f6af072db107c23b7fc8fb not found: ID does not exist" Apr 16 18:32:15.199372 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.199347 2578 scope.go:117] "RemoveContainer" containerID="5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9" Apr 16 18:32:15.199579 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:32:15.199562 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9\": container with ID starting with 5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9 not found: ID does not exist" containerID="5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9" Apr 16 18:32:15.199621 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.199585 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9"} err="failed to get container status \"5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9\": rpc error: code = NotFound desc = could not find container \"5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9\": container with ID starting with 5937bde8537b5f9f9f6bff3fc50890555ec883e8d0df4bb4b63820abea917fe9 not found: ID does not exist" Apr 16 18:32:15.203736 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.203715 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"] Apr 
16 18:32:15.210303 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.210280 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-jtmwz"] Apr 16 18:32:15.647975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:15.647946 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" path="/var/lib/kubelet/pods/99ce985f-43a6-4ac2-94e9-f180b18b0b45/volumes" Apr 16 18:32:16.188520 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:16.188488 2578 generic.go:358] "Generic (PLEG): container finished" podID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerID="be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337" exitCode=0 Apr 16 18:32:16.188715 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:16.188569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" event={"ID":"49886d4a-d8d7-4d77-9dec-4227cc327c56","Type":"ContainerDied","Data":"be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337"} Apr 16 18:32:17.193345 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:17.193305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" event={"ID":"49886d4a-d8d7-4d77-9dec-4227cc327c56","Type":"ContainerStarted","Data":"6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d"} Apr 16 18:32:17.193735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:17.193617 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:32:17.194860 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:17.194832 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:32:17.211827 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:17.211782 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podStartSLOduration=6.211768387 podStartE2EDuration="6.211768387s" podCreationTimestamp="2026-04-16 18:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:17.210282619 +0000 UTC m=+1732.153895674" watchObservedRunningTime="2026-04-16 18:32:17.211768387 +0000 UTC m=+1732.155381446" Apr 16 18:32:18.196710 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:18.196673 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:32:28.196944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:28.196898 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:32:38.196970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:38.196928 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:32:48.196983 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:48.196938 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:32:58.197046 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:32:58.196991 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:33:08.197825 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:08.197715 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:33:18.197490 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:18.197444 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:33:19.644760 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:19.644719 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:33:29.645076 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:29.645031 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 16 18:33:39.647757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:39.647731 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:33:42.205339 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.205306 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2"] Apr 16 18:33:42.205812 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.205711 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" containerID="cri-o://6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d" gracePeriod=30 Apr 16 18:33:42.318530 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.318493 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5"] Apr 16 18:33:42.318835 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.318814 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="storage-initializer" Apr 16 18:33:42.318835 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.318828 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="storage-initializer" Apr 16 18:33:42.318835 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.318836 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" Apr 16 18:33:42.318960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.318842 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" Apr 16 18:33:42.318960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.318907 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="99ce985f-43a6-4ac2-94e9-f180b18b0b45" containerName="kserve-container" Apr 16 18:33:42.321867 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.321851 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:33:42.331909 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.331883 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5"] Apr 16 18:33:42.484047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.483968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1302f8-3cb2-46f5-bd14-1629ade13394-kserve-provision-location\") pod \"isvc-primary-4e36ec-predictor-668b88d45c-jm7d5\" (UID: \"7e1302f8-3cb2-46f5-bd14-1629ade13394\") " pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:33:42.584546 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.584512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1302f8-3cb2-46f5-bd14-1629ade13394-kserve-provision-location\") pod \"isvc-primary-4e36ec-predictor-668b88d45c-jm7d5\" (UID: \"7e1302f8-3cb2-46f5-bd14-1629ade13394\") " pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:33:42.584915 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.584894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1302f8-3cb2-46f5-bd14-1629ade13394-kserve-provision-location\") pod 
\"isvc-primary-4e36ec-predictor-668b88d45c-jm7d5\" (UID: \"7e1302f8-3cb2-46f5-bd14-1629ade13394\") " pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:33:42.632553 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.632519 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:33:42.755423 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:42.755402 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5"] Apr 16 18:33:43.455268 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:43.455225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" event={"ID":"7e1302f8-3cb2-46f5-bd14-1629ade13394","Type":"ContainerStarted","Data":"0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd"} Apr 16 18:33:43.455268 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:43.455273 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" event={"ID":"7e1302f8-3cb2-46f5-bd14-1629ade13394","Type":"ContainerStarted","Data":"2b437b440b886c942840bd550f92e24ca9b51c5803baa3fe5612bf9981d1874d"} Apr 16 18:33:46.041761 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.041737 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:33:46.219849 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.219818 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49886d4a-d8d7-4d77-9dec-4227cc327c56-kserve-provision-location\") pod \"49886d4a-d8d7-4d77-9dec-4227cc327c56\" (UID: \"49886d4a-d8d7-4d77-9dec-4227cc327c56\") " Apr 16 18:33:46.220220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.220176 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49886d4a-d8d7-4d77-9dec-4227cc327c56-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "49886d4a-d8d7-4d77-9dec-4227cc327c56" (UID: "49886d4a-d8d7-4d77-9dec-4227cc327c56"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:46.320723 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.320687 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49886d4a-d8d7-4d77-9dec-4227cc327c56-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:33:46.465969 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.465923 2578 generic.go:358] "Generic (PLEG): container finished" podID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerID="6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d" exitCode=0 Apr 16 18:33:46.466131 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.466002 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" Apr 16 18:33:46.466131 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.466005 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" event={"ID":"49886d4a-d8d7-4d77-9dec-4227cc327c56","Type":"ContainerDied","Data":"6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d"} Apr 16 18:33:46.466131 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.466046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2" event={"ID":"49886d4a-d8d7-4d77-9dec-4227cc327c56","Type":"ContainerDied","Data":"03c94df4d244d33425b89a744416444e0b8b8bcd95f68e9a0c3eb16eeffc44c4"} Apr 16 18:33:46.466131 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.466062 2578 scope.go:117] "RemoveContainer" containerID="6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d" Apr 16 18:33:46.474073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.474055 2578 scope.go:117] "RemoveContainer" containerID="be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337" Apr 16 18:33:46.480934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.480918 2578 scope.go:117] "RemoveContainer" containerID="6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d" Apr 16 18:33:46.481177 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:33:46.481159 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d\": container with ID starting with 6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d not found: ID does not exist" containerID="6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d" Apr 16 18:33:46.481261 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.481201 2578 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d"} err="failed to get container status \"6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d\": rpc error: code = NotFound desc = could not find container \"6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d\": container with ID starting with 6fa64a2df1c52694b7bb62c8f2c59f88657de4016730641222a91c7e1d76c79d not found: ID does not exist" Apr 16 18:33:46.481261 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.481221 2578 scope.go:117] "RemoveContainer" containerID="be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337" Apr 16 18:33:46.481463 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:33:46.481444 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337\": container with ID starting with be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337 not found: ID does not exist" containerID="be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337" Apr 16 18:33:46.481519 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.481472 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337"} err="failed to get container status \"be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337\": rpc error: code = NotFound desc = could not find container \"be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337\": container with ID starting with be44e341abff89d60a3501d941c9cf9a59a174e66512d5dca475190bcb355337 not found: ID does not exist" Apr 16 18:33:46.501511 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.501481 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2"] Apr 
16 18:33:46.507847 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:46.507826 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-kpgh2"] Apr 16 18:33:47.471005 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:47.470976 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerID="0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd" exitCode=0 Apr 16 18:33:47.471409 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:47.471018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" event={"ID":"7e1302f8-3cb2-46f5-bd14-1629ade13394","Type":"ContainerDied","Data":"0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd"} Apr 16 18:33:47.648364 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:47.648331 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" path="/var/lib/kubelet/pods/49886d4a-d8d7-4d77-9dec-4227cc327c56/volumes" Apr 16 18:33:48.475266 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:48.475237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" event={"ID":"7e1302f8-3cb2-46f5-bd14-1629ade13394","Type":"ContainerStarted","Data":"f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2"} Apr 16 18:33:48.475706 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:48.475493 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:33:48.476889 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:48.476864 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:33:48.491901 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:48.491849 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podStartSLOduration=6.49183459 podStartE2EDuration="6.49183459s" podCreationTimestamp="2026-04-16 18:33:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:48.491509554 +0000 UTC m=+1823.435122608" watchObservedRunningTime="2026-04-16 18:33:48.49183459 +0000 UTC m=+1823.435447645" Apr 16 18:33:49.478934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:49.478899 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:33:59.479069 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:33:59.479025 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:34:09.479479 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:34:09.479427 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:34:19.479468 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:34:19.479421 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:34:29.479316 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:34:29.479270 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:34:39.479855 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:34:39.479769 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:34:49.479268 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:34:49.479221 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 16 18:34:59.480373 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:34:59.480340 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:35:02.456129 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.456091 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm"] Apr 16 18:35:02.456534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.456441 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" Apr 16 18:35:02.456534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.456453 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" Apr 16 18:35:02.456534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.456466 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="storage-initializer" Apr 16 18:35:02.456534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.456473 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="storage-initializer" Apr 16 18:35:02.456534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.456530 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49886d4a-d8d7-4d77-9dec-4227cc327c56" containerName="kserve-container" Apr 16 18:35:02.459785 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.459767 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.461960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.461939 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-4e36ec-dockercfg-5dhfx\"" Apr 16 18:35:02.462590 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.462572 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-4e36ec\"" Apr 16 18:35:02.462675 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.462650 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 18:35:02.469966 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.469941 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm"] Apr 16 18:35:02.565043 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.565006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73953d20-d18b-4986-bfbc-086af34d2d27-kserve-provision-location\") pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.565253 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.565113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/73953d20-d18b-4986-bfbc-086af34d2d27-cabundle-cert\") pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.665911 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:35:02.665873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/73953d20-d18b-4986-bfbc-086af34d2d27-cabundle-cert\") pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.666087 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.665920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73953d20-d18b-4986-bfbc-086af34d2d27-kserve-provision-location\") pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.666287 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.666272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73953d20-d18b-4986-bfbc-086af34d2d27-kserve-provision-location\") pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.666562 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.666543 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/73953d20-d18b-4986-bfbc-086af34d2d27-cabundle-cert\") pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.770446 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.770348 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:02.894374 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.894342 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm"] Apr 16 18:35:02.897474 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:35:02.897447 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73953d20_d18b_4986_bfbc_086af34d2d27.slice/crio-0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc WatchSource:0}: Error finding container 0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc: Status 404 returned error can't find the container with id 0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc Apr 16 18:35:02.899292 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:02.899273 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:35:03.699371 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:03.699333 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" event={"ID":"73953d20-d18b-4986-bfbc-086af34d2d27","Type":"ContainerStarted","Data":"9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e"} Apr 16 18:35:03.699371 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:03.699378 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" event={"ID":"73953d20-d18b-4986-bfbc-086af34d2d27","Type":"ContainerStarted","Data":"0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc"} Apr 16 18:35:08.714824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:08.714797 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/0.log" Apr 16 18:35:08.715220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:08.714836 2578 generic.go:358] "Generic (PLEG): container finished" podID="73953d20-d18b-4986-bfbc-086af34d2d27" containerID="9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e" exitCode=1 Apr 16 18:35:08.715220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:08.714917 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" event={"ID":"73953d20-d18b-4986-bfbc-086af34d2d27","Type":"ContainerDied","Data":"9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e"} Apr 16 18:35:09.719158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:09.719124 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/0.log" Apr 16 18:35:09.719557 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:09.719234 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" event={"ID":"73953d20-d18b-4986-bfbc-086af34d2d27","Type":"ContainerStarted","Data":"90a5246874852b79cd3d37f890b51b67fe96d80632b906c2ecc479e08b699e63"} Apr 16 18:35:10.723386 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:10.723360 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/1.log" Apr 16 18:35:10.723759 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:10.723677 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/0.log" Apr 16 18:35:10.723759 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:10.723713 2578 generic.go:358] "Generic (PLEG): container finished" podID="73953d20-d18b-4986-bfbc-086af34d2d27" containerID="90a5246874852b79cd3d37f890b51b67fe96d80632b906c2ecc479e08b699e63" exitCode=1 Apr 16 18:35:10.723851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:10.723790 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" event={"ID":"73953d20-d18b-4986-bfbc-086af34d2d27","Type":"ContainerDied","Data":"90a5246874852b79cd3d37f890b51b67fe96d80632b906c2ecc479e08b699e63"} Apr 16 18:35:10.723851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:10.723831 2578 scope.go:117] "RemoveContainer" containerID="9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e" Apr 16 18:35:10.724164 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:10.724148 2578 scope.go:117] "RemoveContainer" containerID="9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e" Apr 16 18:35:10.734041 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:35:10.734011 2578 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_kserve-ci-e2e-test_73953d20-d18b-4986-bfbc-086af34d2d27_0 in pod sandbox 0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc from index: no such id: '9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e'" containerID="9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e" Apr 16 18:35:10.734107 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:35:10.734070 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_kserve-ci-e2e-test_73953d20-d18b-4986-bfbc-086af34d2d27_0 in pod sandbox 
0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc from index: no such id: '9a6a84dbee1e6e8ffc85812543fab4ddc0a9cb057e50e7747dcf8e8828dc115e'; Skipping pod \"isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_kserve-ci-e2e-test(73953d20-d18b-4986-bfbc-086af34d2d27)\"" logger="UnhandledError" Apr 16 18:35:10.735389 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:35:10.735370 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_kserve-ci-e2e-test(73953d20-d18b-4986-bfbc-086af34d2d27)\"" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" Apr 16 18:35:11.728042 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:11.728015 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/1.log" Apr 16 18:35:20.540980 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.540951 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm"] Apr 16 18:35:20.609431 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.609398 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5"] Apr 16 18:35:20.609720 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.609694 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" containerID="cri-o://f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2" gracePeriod=30 Apr 16 18:35:20.669402 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:35:20.669360 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b"] Apr 16 18:35:20.674911 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.674885 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.676910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.676891 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-afc061\"" Apr 16 18:35:20.677029 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.676966 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-afc061-dockercfg-4v2gj\"" Apr 16 18:35:20.679952 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.679931 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/1.log" Apr 16 18:35:20.680075 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.679995 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:20.681279 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.681256 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b"] Apr 16 18:35:20.755758 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.755727 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm_73953d20-d18b-4986-bfbc-086af34d2d27/storage-initializer/1.log" Apr 16 18:35:20.755927 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.755841 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" Apr 16 18:35:20.755927 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.755870 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm" event={"ID":"73953d20-d18b-4986-bfbc-086af34d2d27","Type":"ContainerDied","Data":"0747ba67ebbb6ab2554020f84b99fd0537bb5c7288bba67e720f71452cc638cc"} Apr 16 18:35:20.755927 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.755913 2578 scope.go:117] "RemoveContainer" containerID="90a5246874852b79cd3d37f890b51b67fe96d80632b906c2ecc479e08b699e63" Apr 16 18:35:20.817217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.817119 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/73953d20-d18b-4986-bfbc-086af34d2d27-cabundle-cert\") pod \"73953d20-d18b-4986-bfbc-086af34d2d27\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " Apr 16 18:35:20.817341 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.817322 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73953d20-d18b-4986-bfbc-086af34d2d27-kserve-provision-location\") pod \"73953d20-d18b-4986-bfbc-086af34d2d27\" (UID: \"73953d20-d18b-4986-bfbc-086af34d2d27\") " Apr 16 18:35:20.817440 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.817426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9db4763-8971-45ee-b653-e1591240c513-cabundle-cert\") pod \"isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.817513 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.817487 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9db4763-8971-45ee-b653-e1591240c513-kserve-provision-location\") pod \"isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.817619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.817564 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73953d20-d18b-4986-bfbc-086af34d2d27-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73953d20-d18b-4986-bfbc-086af34d2d27" (UID: "73953d20-d18b-4986-bfbc-086af34d2d27"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:35:20.817619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.817502 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73953d20-d18b-4986-bfbc-086af34d2d27-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "73953d20-d18b-4986-bfbc-086af34d2d27" (UID: "73953d20-d18b-4986-bfbc-086af34d2d27"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:20.918738 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.918709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9db4763-8971-45ee-b653-e1591240c513-cabundle-cert\") pod \"isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.918932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.918791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9db4763-8971-45ee-b653-e1591240c513-kserve-provision-location\") pod \"isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.918932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.918833 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73953d20-d18b-4986-bfbc-086af34d2d27-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:35:20.918932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.918847 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/73953d20-d18b-4986-bfbc-086af34d2d27-cabundle-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:35:20.919244 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.919222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9db4763-8971-45ee-b653-e1591240c513-kserve-provision-location\") pod \"isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b\" (UID: 
\"a9db4763-8971-45ee-b653-e1591240c513\") " pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.919438 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.919416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9db4763-8971-45ee-b653-e1591240c513-cabundle-cert\") pod \"isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:20.988920 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:20.988878 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:21.095626 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:21.095559 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm"] Apr 16 18:35:21.099785 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:21.099759 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-4e36ec-predictor-d847cd8cb-vjqgm"] Apr 16 18:35:21.110142 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:21.110113 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b"] Apr 16 18:35:21.112887 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:35:21.112861 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9db4763_8971_45ee_b653_e1591240c513.slice/crio-551447873a2b9f923db600b94bfd767bfc6cc80773bc422d103ecfe1b1744140 WatchSource:0}: Error finding container 551447873a2b9f923db600b94bfd767bfc6cc80773bc422d103ecfe1b1744140: Status 404 returned error can't find the container with id 551447873a2b9f923db600b94bfd767bfc6cc80773bc422d103ecfe1b1744140 Apr 16 
18:35:21.648838 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:21.648803 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" path="/var/lib/kubelet/pods/73953d20-d18b-4986-bfbc-086af34d2d27/volumes" Apr 16 18:35:21.761427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:21.761394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" event={"ID":"a9db4763-8971-45ee-b653-e1591240c513","Type":"ContainerStarted","Data":"3d24fab624382c79a9453861acced9776d75b58462ffecf3214eaadfadf33c02"} Apr 16 18:35:21.761427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:21.761427 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" event={"ID":"a9db4763-8971-45ee-b653-e1591240c513","Type":"ContainerStarted","Data":"551447873a2b9f923db600b94bfd767bfc6cc80773bc422d103ecfe1b1744140"} Apr 16 18:35:24.936206 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:24.936165 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:35:25.056585 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.056497 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1302f8-3cb2-46f5-bd14-1629ade13394-kserve-provision-location\") pod \"7e1302f8-3cb2-46f5-bd14-1629ade13394\" (UID: \"7e1302f8-3cb2-46f5-bd14-1629ade13394\") " Apr 16 18:35:25.056837 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.056816 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e1302f8-3cb2-46f5-bd14-1629ade13394-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e1302f8-3cb2-46f5-bd14-1629ade13394" (UID: "7e1302f8-3cb2-46f5-bd14-1629ade13394"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:35:25.157425 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.157388 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e1302f8-3cb2-46f5-bd14-1629ade13394-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:35:25.776047 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.776011 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerID="f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2" exitCode=0 Apr 16 18:35:25.776255 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.776095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" event={"ID":"7e1302f8-3cb2-46f5-bd14-1629ade13394","Type":"ContainerDied","Data":"f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2"} Apr 16 18:35:25.776255 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.776112 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" Apr 16 18:35:25.776255 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.776136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5" event={"ID":"7e1302f8-3cb2-46f5-bd14-1629ade13394","Type":"ContainerDied","Data":"2b437b440b886c942840bd550f92e24ca9b51c5803baa3fe5612bf9981d1874d"} Apr 16 18:35:25.776255 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.776157 2578 scope.go:117] "RemoveContainer" containerID="f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2" Apr 16 18:35:25.783754 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.783735 2578 scope.go:117] "RemoveContainer" containerID="0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd" Apr 16 18:35:25.790309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.790294 2578 scope.go:117] "RemoveContainer" containerID="f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2" Apr 16 18:35:25.790768 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:35:25.790634 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2\": container with ID starting with f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2 not found: ID does not exist" containerID="f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2" Apr 16 18:35:25.790768 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.790670 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2"} err="failed to get container status \"f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2\": rpc error: code = NotFound desc = could not find container 
\"f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2\": container with ID starting with f3b9d6ea2971e9049d06a7a3737eb49826d8f784942e3429456121e55a10e6b2 not found: ID does not exist" Apr 16 18:35:25.790768 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.790693 2578 scope.go:117] "RemoveContainer" containerID="0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd" Apr 16 18:35:25.791114 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:35:25.791086 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd\": container with ID starting with 0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd not found: ID does not exist" containerID="0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd" Apr 16 18:35:25.791230 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.791120 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd"} err="failed to get container status \"0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd\": rpc error: code = NotFound desc = could not find container \"0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd\": container with ID starting with 0f4b8b68239bd58e82c36adb790a5677c9b9472b7826dec7569bfc1d69d421cd not found: ID does not exist" Apr 16 18:35:25.792387 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.792369 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5"] Apr 16 18:35:25.797056 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:25.797036 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-4e36ec-predictor-668b88d45c-jm7d5"] Apr 16 18:35:26.781986 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:26.781931 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b_a9db4763-8971-45ee-b653-e1591240c513/storage-initializer/0.log" Apr 16 18:35:26.781986 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:26.781982 2578 generic.go:358] "Generic (PLEG): container finished" podID="a9db4763-8971-45ee-b653-e1591240c513" containerID="3d24fab624382c79a9453861acced9776d75b58462ffecf3214eaadfadf33c02" exitCode=1 Apr 16 18:35:26.782397 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:26.782025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" event={"ID":"a9db4763-8971-45ee-b653-e1591240c513","Type":"ContainerDied","Data":"3d24fab624382c79a9453861acced9776d75b58462ffecf3214eaadfadf33c02"} Apr 16 18:35:27.648085 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:27.648053 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" path="/var/lib/kubelet/pods/7e1302f8-3cb2-46f5-bd14-1629ade13394/volumes" Apr 16 18:35:27.786339 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:27.786315 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b_a9db4763-8971-45ee-b653-e1591240c513/storage-initializer/0.log" Apr 16 18:35:27.786756 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:27.786428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" event={"ID":"a9db4763-8971-45ee-b653-e1591240c513","Type":"ContainerStarted","Data":"0c2538dfc3a86585e512201debc75a61e06118fa05a28142306716254836f31d"} Apr 16 18:35:30.782208 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.782159 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b"] Apr 16 18:35:30.782621 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:35:30.782481 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" containerID="cri-o://0c2538dfc3a86585e512201debc75a61e06118fa05a28142306716254836f31d" gracePeriod=30 Apr 16 18:35:30.797587 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.797564 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b_a9db4763-8971-45ee-b653-e1591240c513/storage-initializer/1.log" Apr 16 18:35:30.797981 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.797964 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b_a9db4763-8971-45ee-b653-e1591240c513/storage-initializer/0.log" Apr 16 18:35:30.798062 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.797998 2578 generic.go:358] "Generic (PLEG): container finished" podID="a9db4763-8971-45ee-b653-e1591240c513" containerID="0c2538dfc3a86585e512201debc75a61e06118fa05a28142306716254836f31d" exitCode=1 Apr 16 18:35:30.798109 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.798080 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" event={"ID":"a9db4763-8971-45ee-b653-e1591240c513","Type":"ContainerDied","Data":"0c2538dfc3a86585e512201debc75a61e06118fa05a28142306716254836f31d"} Apr 16 18:35:30.798157 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.798124 2578 scope.go:117] "RemoveContainer" containerID="3d24fab624382c79a9453861acced9776d75b58462ffecf3214eaadfadf33c02" Apr 16 18:35:30.885289 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885255 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv"] Apr 16 18:35:30.885739 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:35:30.885717 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" Apr 16 18:35:30.885739 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885742 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885758 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885765 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885774 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885783 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885793 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885801 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885873 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" 
containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885885 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="73953d20-d18b-4986-bfbc-086af34d2d27" containerName="storage-initializer" Apr 16 18:35:30.885938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.885899 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e1302f8-3cb2-46f5-bd14-1629ade13394" containerName="kserve-container" Apr 16 18:35:30.890438 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.890419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:35:30.893008 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.892990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-89kqm\"" Apr 16 18:35:30.900203 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.900166 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv"] Apr 16 18:35:30.918621 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.918602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b_a9db4763-8971-45ee-b653-e1591240c513/storage-initializer/1.log" Apr 16 18:35:30.918716 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:30.918659 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:31.012275 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.012242 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9db4763-8971-45ee-b653-e1591240c513-cabundle-cert\") pod \"a9db4763-8971-45ee-b653-e1591240c513\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " Apr 16 18:35:31.012275 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.012278 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9db4763-8971-45ee-b653-e1591240c513-kserve-provision-location\") pod \"a9db4763-8971-45ee-b653-e1591240c513\" (UID: \"a9db4763-8971-45ee-b653-e1591240c513\") " Apr 16 18:35:31.012486 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.012453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd3211c-6db3-4bbc-be48-21f7ce265512-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-8pqcv\" (UID: \"5dd3211c-6db3-4bbc-be48-21f7ce265512\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:35:31.012600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.012580 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9db4763-8971-45ee-b653-e1591240c513-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9db4763-8971-45ee-b653-e1591240c513" (UID: "a9db4763-8971-45ee-b653-e1591240c513"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:35:31.012651 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.012600 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9db4763-8971-45ee-b653-e1591240c513-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a9db4763-8971-45ee-b653-e1591240c513" (UID: "a9db4763-8971-45ee-b653-e1591240c513"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:35:31.113706 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.113615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd3211c-6db3-4bbc-be48-21f7ce265512-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-8pqcv\" (UID: \"5dd3211c-6db3-4bbc-be48-21f7ce265512\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:35:31.113847 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.113710 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9db4763-8971-45ee-b653-e1591240c513-cabundle-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:35:31.113847 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.113725 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9db4763-8971-45ee-b653-e1591240c513-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:35:31.113992 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.113976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd3211c-6db3-4bbc-be48-21f7ce265512-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-8pqcv\" (UID: 
\"5dd3211c-6db3-4bbc-be48-21f7ce265512\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:35:31.216871 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.216831 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:35:31.337132 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.337102 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv"] Apr 16 18:35:31.339416 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:35:31.339387 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd3211c_6db3_4bbc_be48_21f7ce265512.slice/crio-6bec246b271b864321ce61ee6e24b08ecf706df61bfa22735802b79cd7d1582d WatchSource:0}: Error finding container 6bec246b271b864321ce61ee6e24b08ecf706df61bfa22735802b79cd7d1582d: Status 404 returned error can't find the container with id 6bec246b271b864321ce61ee6e24b08ecf706df61bfa22735802b79cd7d1582d Apr 16 18:35:31.803148 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.803113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" event={"ID":"5dd3211c-6db3-4bbc-be48-21f7ce265512","Type":"ContainerStarted","Data":"6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af"} Apr 16 18:35:31.803148 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.803150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" event={"ID":"5dd3211c-6db3-4bbc-be48-21f7ce265512","Type":"ContainerStarted","Data":"6bec246b271b864321ce61ee6e24b08ecf706df61bfa22735802b79cd7d1582d"} Apr 16 18:35:31.804314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.804296 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b_a9db4763-8971-45ee-b653-e1591240c513/storage-initializer/1.log" Apr 16 18:35:31.804417 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.804375 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" event={"ID":"a9db4763-8971-45ee-b653-e1591240c513","Type":"ContainerDied","Data":"551447873a2b9f923db600b94bfd767bfc6cc80773bc422d103ecfe1b1744140"} Apr 16 18:35:31.804417 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.804399 2578 scope.go:117] "RemoveContainer" containerID="0c2538dfc3a86585e512201debc75a61e06118fa05a28142306716254836f31d" Apr 16 18:35:31.804507 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.804417 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b" Apr 16 18:35:31.847249 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.847219 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b"] Apr 16 18:35:31.850043 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:31.850012 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afc061-predictor-5b7d9ccc6-9w59b"] Apr 16 18:35:33.647392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:33.647359 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9db4763-8971-45ee-b653-e1591240c513" path="/var/lib/kubelet/pods/a9db4763-8971-45ee-b653-e1591240c513/volumes" Apr 16 18:35:35.818445 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:35.818412 2578 generic.go:358] "Generic (PLEG): container finished" podID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerID="6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af" exitCode=0 Apr 16 18:35:35.818797 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:35.818482 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" event={"ID":"5dd3211c-6db3-4bbc-be48-21f7ce265512","Type":"ContainerDied","Data":"6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af"} Apr 16 18:35:53.887218 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:53.887113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" event={"ID":"5dd3211c-6db3-4bbc-be48-21f7ce265512","Type":"ContainerStarted","Data":"313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc"} Apr 16 18:35:53.887566 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:53.887413 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:35:53.888563 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:53.888541 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:35:53.904984 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:35:53.904927 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podStartSLOduration=6.107260364 podStartE2EDuration="23.904910032s" podCreationTimestamp="2026-04-16 18:35:30 +0000 UTC" firstStartedPulling="2026-04-16 18:35:35.819577522 +0000 UTC m=+1930.763190556" lastFinishedPulling="2026-04-16 18:35:53.617227189 +0000 UTC m=+1948.560840224" observedRunningTime="2026-04-16 18:35:53.902863114 +0000 UTC m=+1948.846476170" watchObservedRunningTime="2026-04-16 18:35:53.904910032 +0000 UTC m=+1948.848523088" Apr 16 18:35:54.890925 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:35:54.890887 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:04.891847 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:04.891807 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:14.891522 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:14.891473 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:24.891033 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:24.890984 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:34.891910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:34.891861 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:44.891271 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:44.891221 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:54.891771 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:54.891719 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:36:55.651938 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:36:55.651885 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:37:05.649559 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:05.649515 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 18:37:15.650359 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:15.650328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:37:21.022416 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.022376 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv"] Apr 16 18:37:21.022891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.022659 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" containerID="cri-o://313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc" gracePeriod=30 Apr 16 18:37:21.122970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.122929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh"] Apr 16 18:37:21.123376 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.123358 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" Apr 16 18:37:21.123475 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.123378 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" Apr 16 18:37:21.123475 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.123396 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" Apr 16 18:37:21.123475 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.123406 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" Apr 16 18:37:21.123475 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.123470 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" Apr 16 18:37:21.123679 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.123485 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9db4763-8971-45ee-b653-e1591240c513" containerName="storage-initializer" Apr 16 18:37:21.126395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.126375 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:37:21.133854 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.133832 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh"] Apr 16 18:37:21.254314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.254280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40447185-02f2-47ec-8dcb-625fbc05fe86-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh\" (UID: \"40447185-02f2-47ec-8dcb-625fbc05fe86\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:37:21.355681 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.355589 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40447185-02f2-47ec-8dcb-625fbc05fe86-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh\" (UID: \"40447185-02f2-47ec-8dcb-625fbc05fe86\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:37:21.356005 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.355984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40447185-02f2-47ec-8dcb-625fbc05fe86-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh\" (UID: \"40447185-02f2-47ec-8dcb-625fbc05fe86\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:37:21.436453 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.436413 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:37:21.553639 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:21.553603 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh"] Apr 16 18:37:21.556623 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:37:21.556579 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40447185_02f2_47ec_8dcb_625fbc05fe86.slice/crio-56328744cc28e193a7f36f041176ed8b59f56fa7cc11d4e837a88c58a8cf4f27 WatchSource:0}: Error finding container 56328744cc28e193a7f36f041176ed8b59f56fa7cc11d4e837a88c58a8cf4f27: Status 404 returned error can't find the container with id 56328744cc28e193a7f36f041176ed8b59f56fa7cc11d4e837a88c58a8cf4f27 Apr 16 18:37:22.155945 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:22.155911 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" event={"ID":"40447185-02f2-47ec-8dcb-625fbc05fe86","Type":"ContainerStarted","Data":"5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777"} Apr 16 18:37:22.155945 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:22.155950 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" event={"ID":"40447185-02f2-47ec-8dcb-625fbc05fe86","Type":"ContainerStarted","Data":"56328744cc28e193a7f36f041176ed8b59f56fa7cc11d4e837a88c58a8cf4f27"} Apr 16 18:37:25.649482 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:25.649437 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.43:8080: connect: connection refused" Apr 16 
18:37:25.757747 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:25.757724 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:37:25.895260 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:25.895159 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd3211c-6db3-4bbc-be48-21f7ce265512-kserve-provision-location\") pod \"5dd3211c-6db3-4bbc-be48-21f7ce265512\" (UID: \"5dd3211c-6db3-4bbc-be48-21f7ce265512\") " Apr 16 18:37:25.895485 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:25.895460 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd3211c-6db3-4bbc-be48-21f7ce265512-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5dd3211c-6db3-4bbc-be48-21f7ce265512" (UID: "5dd3211c-6db3-4bbc-be48-21f7ce265512"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:37:25.996580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:25.996538 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5dd3211c-6db3-4bbc-be48-21f7ce265512-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:37:26.168730 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.168632 2578 generic.go:358] "Generic (PLEG): container finished" podID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerID="313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc" exitCode=0 Apr 16 18:37:26.168730 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.168702 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" Apr 16 18:37:26.168730 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.168713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" event={"ID":"5dd3211c-6db3-4bbc-be48-21f7ce265512","Type":"ContainerDied","Data":"313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc"} Apr 16 18:37:26.169003 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.168753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv" event={"ID":"5dd3211c-6db3-4bbc-be48-21f7ce265512","Type":"ContainerDied","Data":"6bec246b271b864321ce61ee6e24b08ecf706df61bfa22735802b79cd7d1582d"} Apr 16 18:37:26.169003 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.168770 2578 scope.go:117] "RemoveContainer" containerID="313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc" Apr 16 18:37:26.170150 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.170131 2578 generic.go:358] "Generic (PLEG): container finished" podID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerID="5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777" exitCode=0 Apr 16 18:37:26.170150 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.170158 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" event={"ID":"40447185-02f2-47ec-8dcb-625fbc05fe86","Type":"ContainerDied","Data":"5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777"} Apr 16 18:37:26.177402 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.177380 2578 scope.go:117] "RemoveContainer" containerID="6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af" Apr 16 18:37:26.184538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.184489 2578 scope.go:117] "RemoveContainer" 
containerID="313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc" Apr 16 18:37:26.184772 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:37:26.184752 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc\": container with ID starting with 313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc not found: ID does not exist" containerID="313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc" Apr 16 18:37:26.184850 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.184784 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc"} err="failed to get container status \"313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc\": rpc error: code = NotFound desc = could not find container \"313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc\": container with ID starting with 313701bf4809d8b5918a2fcab65da2336c6207d7efa179a17aaa3a78e6bbc6bc not found: ID does not exist" Apr 16 18:37:26.184850 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.184809 2578 scope.go:117] "RemoveContainer" containerID="6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af" Apr 16 18:37:26.185060 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:37:26.185043 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af\": container with ID starting with 6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af not found: ID does not exist" containerID="6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af" Apr 16 18:37:26.185115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.185067 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af"} err="failed to get container status \"6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af\": rpc error: code = NotFound desc = could not find container \"6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af\": container with ID starting with 6ecba654abfd92f744a3ae8aa2d4ac6df4cb0c07003e85949672f3a9e71233af not found: ID does not exist" Apr 16 18:37:26.198601 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.198579 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv"] Apr 16 18:37:26.204578 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:26.204558 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8pqcv"] Apr 16 18:37:27.175143 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:27.175109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" event={"ID":"40447185-02f2-47ec-8dcb-625fbc05fe86","Type":"ContainerStarted","Data":"720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603"} Apr 16 18:37:27.175642 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:27.175433 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:37:27.176936 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:27.176911 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:37:27.193977 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:27.193906 2578 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podStartSLOduration=6.193892495 podStartE2EDuration="6.193892495s" podCreationTimestamp="2026-04-16 18:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:37:27.19252258 +0000 UTC m=+2042.136135636" watchObservedRunningTime="2026-04-16 18:37:27.193892495 +0000 UTC m=+2042.137505550" Apr 16 18:37:27.653441 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:27.653404 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" path="/var/lib/kubelet/pods/5dd3211c-6db3-4bbc-be48-21f7ce265512/volumes" Apr 16 18:37:28.178953 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:28.178916 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:37:38.179625 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:38.179533 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:37:48.179319 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:48.179274 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:37:58.179796 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:37:58.179747 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:38:08.179659 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:08.179605 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:38:18.179460 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:18.179415 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:38:28.179325 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:28.179274 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:38:35.646471 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:35.646431 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:38:45.647964 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:45.647936 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:38:51.239944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.239912 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh"] Apr 16 18:38:51.240395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.240218 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" containerID="cri-o://720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603" gracePeriod=30 Apr 16 18:38:51.333847 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.333811 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b"] Apr 16 18:38:51.334160 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.334147 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" Apr 16 18:38:51.334227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.334162 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" Apr 16 18:38:51.334227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.334176 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="storage-initializer" Apr 16 18:38:51.334227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.334181 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="storage-initializer" Apr 16 18:38:51.334330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.334274 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5dd3211c-6db3-4bbc-be48-21f7ce265512" containerName="kserve-container" Apr 16 18:38:51.337243 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.337228 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:38:51.345217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.345171 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b"] Apr 16 18:38:51.404707 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.404674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8d2768-e34c-4a14-9d81-6b2ee2845b94-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-rkf5b\" (UID: \"ca8d2768-e34c-4a14-9d81-6b2ee2845b94\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:38:51.505816 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.505721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8d2768-e34c-4a14-9d81-6b2ee2845b94-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-rkf5b\" (UID: \"ca8d2768-e34c-4a14-9d81-6b2ee2845b94\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:38:51.506125 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.506107 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8d2768-e34c-4a14-9d81-6b2ee2845b94-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-rkf5b\" (UID: \"ca8d2768-e34c-4a14-9d81-6b2ee2845b94\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 
18:38:51.647880 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.647853 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:38:51.766125 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:51.766094 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b"] Apr 16 18:38:51.768655 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:38:51.768631 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca8d2768_e34c_4a14_9d81_6b2ee2845b94.slice/crio-b22e23fcdf673144ffff4a51cc16f6f20ce4edd81cb547816955a1ff8556f971 WatchSource:0}: Error finding container b22e23fcdf673144ffff4a51cc16f6f20ce4edd81cb547816955a1ff8556f971: Status 404 returned error can't find the container with id b22e23fcdf673144ffff4a51cc16f6f20ce4edd81cb547816955a1ff8556f971 Apr 16 18:38:52.430263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:52.430227 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" event={"ID":"ca8d2768-e34c-4a14-9d81-6b2ee2845b94","Type":"ContainerStarted","Data":"2f2b841accfb5e811a0c15b333949332bd77a3c98c4569111ccff38825ebc7e5"} Apr 16 18:38:52.430263 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:52.430268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" event={"ID":"ca8d2768-e34c-4a14-9d81-6b2ee2845b94","Type":"ContainerStarted","Data":"b22e23fcdf673144ffff4a51cc16f6f20ce4edd81cb547816955a1ff8556f971"} Apr 16 18:38:55.646887 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:55.646850 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 16 18:38:55.984747 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:55.984724 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:38:56.046994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.046963 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40447185-02f2-47ec-8dcb-625fbc05fe86-kserve-provision-location\") pod \"40447185-02f2-47ec-8dcb-625fbc05fe86\" (UID: \"40447185-02f2-47ec-8dcb-625fbc05fe86\") " Apr 16 18:38:56.047333 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.047312 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40447185-02f2-47ec-8dcb-625fbc05fe86-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40447185-02f2-47ec-8dcb-625fbc05fe86" (UID: "40447185-02f2-47ec-8dcb-625fbc05fe86"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:38:56.148073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.147984 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40447185-02f2-47ec-8dcb-625fbc05fe86-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:38:56.443076 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.442992 2578 generic.go:358] "Generic (PLEG): container finished" podID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerID="720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603" exitCode=0 Apr 16 18:38:56.443076 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.443058 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" Apr 16 18:38:56.443076 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.443067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" event={"ID":"40447185-02f2-47ec-8dcb-625fbc05fe86","Type":"ContainerDied","Data":"720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603"} Apr 16 18:38:56.443380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.443103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh" event={"ID":"40447185-02f2-47ec-8dcb-625fbc05fe86","Type":"ContainerDied","Data":"56328744cc28e193a7f36f041176ed8b59f56fa7cc11d4e837a88c58a8cf4f27"} Apr 16 18:38:56.443380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.443119 2578 scope.go:117] "RemoveContainer" containerID="720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603" Apr 16 18:38:56.444584 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.444562 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerID="2f2b841accfb5e811a0c15b333949332bd77a3c98c4569111ccff38825ebc7e5" exitCode=0 Apr 16 18:38:56.444696 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.444612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" event={"ID":"ca8d2768-e34c-4a14-9d81-6b2ee2845b94","Type":"ContainerDied","Data":"2f2b841accfb5e811a0c15b333949332bd77a3c98c4569111ccff38825ebc7e5"} Apr 16 18:38:56.453667 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.453294 2578 scope.go:117] "RemoveContainer" containerID="5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777" Apr 16 18:38:56.461247 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.461229 2578 scope.go:117] "RemoveContainer" 
containerID="720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603" Apr 16 18:38:56.461775 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:38:56.461723 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603\": container with ID starting with 720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603 not found: ID does not exist" containerID="720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603" Apr 16 18:38:56.461775 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.461760 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603"} err="failed to get container status \"720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603\": rpc error: code = NotFound desc = could not find container \"720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603\": container with ID starting with 720a8057bd82a64e7e22bafcba3b3ed885199a293e2d4e0a4879a04637c19603 not found: ID does not exist" Apr 16 18:38:56.461988 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.461784 2578 scope.go:117] "RemoveContainer" containerID="5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777" Apr 16 18:38:56.462175 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:38:56.462142 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777\": container with ID starting with 5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777 not found: ID does not exist" containerID="5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777" Apr 16 18:38:56.462266 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.462177 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777"} err="failed to get container status \"5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777\": rpc error: code = NotFound desc = could not find container \"5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777\": container with ID starting with 5dac4e7df07e93b9af661e2c0a2019361167711ceea8a9fdfa4fe47a43dcf777 not found: ID does not exist" Apr 16 18:38:56.473035 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.472969 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh"] Apr 16 18:38:56.474848 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:56.474825 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-wr5xh"] Apr 16 18:38:57.449436 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:57.449406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" event={"ID":"ca8d2768-e34c-4a14-9d81-6b2ee2845b94","Type":"ContainerStarted","Data":"0bb7d82f2bf0269e18b9a8208fc7d335a73b2718b1455985abfec74049b802fb"} Apr 16 18:38:57.449866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:57.449697 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:38:57.451136 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:57.451112 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:38:57.466427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:57.466385 2578 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podStartSLOduration=6.466373253 podStartE2EDuration="6.466373253s" podCreationTimestamp="2026-04-16 18:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:38:57.465062915 +0000 UTC m=+2132.408675971" watchObservedRunningTime="2026-04-16 18:38:57.466373253 +0000 UTC m=+2132.409986308" Apr 16 18:38:57.647811 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:57.647780 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" path="/var/lib/kubelet/pods/40447185-02f2-47ec-8dcb-625fbc05fe86/volumes" Apr 16 18:38:58.453699 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:38:58.453661 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:39:08.454328 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:39:08.454233 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:39:18.453890 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:39:18.453849 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:39:28.454412 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:39:28.454374 2578 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:39:38.453640 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:39:38.453603 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:39:48.454318 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:39:48.454278 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:39:58.454315 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:39:58.454271 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:40:01.644892 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:01.644846 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:40:11.647340 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:11.647313 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:40:21.501587 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.501550 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b"] Apr 16 18:40:21.502132 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.501817 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" containerID="cri-o://0bb7d82f2bf0269e18b9a8208fc7d335a73b2718b1455985abfec74049b802fb" gracePeriod=30 Apr 16 18:40:21.602920 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.602882 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn"] Apr 16 18:40:21.603231 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.603217 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="storage-initializer" Apr 16 18:40:21.603231 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.603231 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="storage-initializer" Apr 16 18:40:21.603330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.603247 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" Apr 16 18:40:21.603330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.603253 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" Apr 16 18:40:21.603330 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.603311 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="40447185-02f2-47ec-8dcb-625fbc05fe86" containerName="kserve-container" Apr 16 18:40:21.606313 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.606295 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:40:21.614998 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.614967 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn"] Apr 16 18:40:21.644802 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.644766 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 16 18:40:21.784017 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.783929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15ce4e3b-5d2e-4dc4-8a36-0319792e1124-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn\" (UID: \"15ce4e3b-5d2e-4dc4-8a36-0319792e1124\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:40:21.885635 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.885584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15ce4e3b-5d2e-4dc4-8a36-0319792e1124-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn\" (UID: \"15ce4e3b-5d2e-4dc4-8a36-0319792e1124\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:40:21.886000 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.885979 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15ce4e3b-5d2e-4dc4-8a36-0319792e1124-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn\" (UID: \"15ce4e3b-5d2e-4dc4-8a36-0319792e1124\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:40:21.916640 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:21.916613 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:40:22.035407 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:22.035331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn"] Apr 16 18:40:22.039007 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:40:22.038975 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ce4e3b_5d2e_4dc4_8a36_0319792e1124.slice/crio-5dbb0faf3463cf51b9e568402da388170eed79a323970278b7b0b828114d13cb WatchSource:0}: Error finding container 5dbb0faf3463cf51b9e568402da388170eed79a323970278b7b0b828114d13cb: Status 404 returned error can't find the container with id 5dbb0faf3463cf51b9e568402da388170eed79a323970278b7b0b828114d13cb Apr 16 18:40:22.043996 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:22.043977 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:40:22.699123 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:22.699080 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" event={"ID":"15ce4e3b-5d2e-4dc4-8a36-0319792e1124","Type":"ContainerStarted","Data":"d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d"} Apr 16 18:40:22.699123 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:40:22.699123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" event={"ID":"15ce4e3b-5d2e-4dc4-8a36-0319792e1124","Type":"ContainerStarted","Data":"5dbb0faf3463cf51b9e568402da388170eed79a323970278b7b0b828114d13cb"} Apr 16 18:40:25.709173 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:25.709144 2578 generic.go:358] "Generic (PLEG): container finished" podID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerID="d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d" exitCode=0 Apr 16 18:40:25.709502 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:25.709224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" event={"ID":"15ce4e3b-5d2e-4dc4-8a36-0319792e1124","Type":"ContainerDied","Data":"d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d"} Apr 16 18:40:26.713165 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.713137 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerID="0bb7d82f2bf0269e18b9a8208fc7d335a73b2718b1455985abfec74049b802fb" exitCode=0 Apr 16 18:40:26.713539 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.713225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" event={"ID":"ca8d2768-e34c-4a14-9d81-6b2ee2845b94","Type":"ContainerDied","Data":"0bb7d82f2bf0269e18b9a8208fc7d335a73b2718b1455985abfec74049b802fb"} Apr 16 18:40:26.714596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.714576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" event={"ID":"15ce4e3b-5d2e-4dc4-8a36-0319792e1124","Type":"ContainerStarted","Data":"cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf"} Apr 16 18:40:26.715096 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.714792 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:40:26.732779 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.732725 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podStartSLOduration=5.732708262 podStartE2EDuration="5.732708262s" podCreationTimestamp="2026-04-16 18:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:40:26.729751605 +0000 UTC m=+2221.673364671" watchObservedRunningTime="2026-04-16 18:40:26.732708262 +0000 UTC m=+2221.676321321" Apr 16 18:40:26.840549 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.840526 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:40:26.924231 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.924201 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8d2768-e34c-4a14-9d81-6b2ee2845b94-kserve-provision-location\") pod \"ca8d2768-e34c-4a14-9d81-6b2ee2845b94\" (UID: \"ca8d2768-e34c-4a14-9d81-6b2ee2845b94\") " Apr 16 18:40:26.924525 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:26.924501 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca8d2768-e34c-4a14-9d81-6b2ee2845b94-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca8d2768-e34c-4a14-9d81-6b2ee2845b94" (UID: "ca8d2768-e34c-4a14-9d81-6b2ee2845b94"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:40:27.024816 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.024727 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8d2768-e34c-4a14-9d81-6b2ee2845b94-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:40:27.719868 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.719843 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" Apr 16 18:40:27.720338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.719839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b" event={"ID":"ca8d2768-e34c-4a14-9d81-6b2ee2845b94","Type":"ContainerDied","Data":"b22e23fcdf673144ffff4a51cc16f6f20ce4edd81cb547816955a1ff8556f971"} Apr 16 18:40:27.720338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.719960 2578 scope.go:117] "RemoveContainer" containerID="0bb7d82f2bf0269e18b9a8208fc7d335a73b2718b1455985abfec74049b802fb" Apr 16 18:40:27.727792 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.727768 2578 scope.go:117] "RemoveContainer" containerID="2f2b841accfb5e811a0c15b333949332bd77a3c98c4569111ccff38825ebc7e5" Apr 16 18:40:27.751305 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.751281 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b"] Apr 16 18:40:27.754958 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:27.754935 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-rkf5b"] Apr 16 18:40:29.648088 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:29.648054 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" path="/var/lib/kubelet/pods/ca8d2768-e34c-4a14-9d81-6b2ee2845b94/volumes" Apr 16 18:40:57.721778 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:40:57.721734 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 18:41:07.721091 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:07.721042 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 18:41:17.721018 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:17.720975 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 18:41:27.721064 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:27.721016 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 18:41:29.644693 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:41:29.644650 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.46:8080: connect: connection refused" Apr 16 18:41:39.649681 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:39.649654 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:41:41.663851 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.663810 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn"] Apr 16 18:41:41.664358 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.664171 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" containerID="cri-o://cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf" gracePeriod=30 Apr 16 18:41:41.742460 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.742429 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg"] Apr 16 18:41:41.742846 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.742830 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" Apr 16 18:41:41.742933 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.742849 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" Apr 16 18:41:41.742933 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:41:41.742869 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="storage-initializer" Apr 16 18:41:41.742933 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.742879 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="storage-initializer" Apr 16 18:41:41.743090 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.742953 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca8d2768-e34c-4a14-9d81-6b2ee2845b94" containerName="kserve-container" Apr 16 18:41:41.746287 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.746267 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:41:41.755428 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.755401 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg"] Apr 16 18:41:41.863005 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.862973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg\" (UID: \"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:41:41.963865 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.963831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg\" (UID: 
\"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:41:41.964240 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:41.964222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg\" (UID: \"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:41:42.056056 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:42.056025 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:41:42.176537 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:42.176505 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg"] Apr 16 18:41:42.179949 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:41:42.179921 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf4374b_a0e6_42d6_911e_f6c9b3e391a1.slice/crio-ea7a610678567de3646e617f2a91a0bc9b4b8c020b756bf98b5eccafe8130d4d WatchSource:0}: Error finding container ea7a610678567de3646e617f2a91a0bc9b4b8c020b756bf98b5eccafe8130d4d: Status 404 returned error can't find the container with id ea7a610678567de3646e617f2a91a0bc9b4b8c020b756bf98b5eccafe8130d4d Apr 16 18:41:42.958633 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:42.958589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" event={"ID":"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1","Type":"ContainerStarted","Data":"2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607"} Apr 16 
18:41:42.958633 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:42.958635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" event={"ID":"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1","Type":"ContainerStarted","Data":"ea7a610678567de3646e617f2a91a0bc9b4b8c020b756bf98b5eccafe8130d4d"} Apr 16 18:41:46.392399 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.392377 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:41:46.507670 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.507633 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15ce4e3b-5d2e-4dc4-8a36-0319792e1124-kserve-provision-location\") pod \"15ce4e3b-5d2e-4dc4-8a36-0319792e1124\" (UID: \"15ce4e3b-5d2e-4dc4-8a36-0319792e1124\") " Apr 16 18:41:46.507954 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.507928 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ce4e3b-5d2e-4dc4-8a36-0319792e1124-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15ce4e3b-5d2e-4dc4-8a36-0319792e1124" (UID: "15ce4e3b-5d2e-4dc4-8a36-0319792e1124"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:46.608249 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.608148 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15ce4e3b-5d2e-4dc4-8a36-0319792e1124-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:41:46.976908 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.976871 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerID="2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607" exitCode=0 Apr 16 18:41:46.977106 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.976904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" event={"ID":"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1","Type":"ContainerDied","Data":"2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607"} Apr 16 18:41:46.978636 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.978609 2578 generic.go:358] "Generic (PLEG): container finished" podID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerID="cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf" exitCode=0 Apr 16 18:41:46.978757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.978656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" event={"ID":"15ce4e3b-5d2e-4dc4-8a36-0319792e1124","Type":"ContainerDied","Data":"cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf"} Apr 16 18:41:46.978757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.978679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" 
event={"ID":"15ce4e3b-5d2e-4dc4-8a36-0319792e1124","Type":"ContainerDied","Data":"5dbb0faf3463cf51b9e568402da388170eed79a323970278b7b0b828114d13cb"} Apr 16 18:41:46.978757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.978694 2578 scope.go:117] "RemoveContainer" containerID="cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf" Apr 16 18:41:46.978757 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.978694 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn" Apr 16 18:41:46.987057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.987044 2578 scope.go:117] "RemoveContainer" containerID="d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d" Apr 16 18:41:46.994972 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.994945 2578 scope.go:117] "RemoveContainer" containerID="cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf" Apr 16 18:41:46.995256 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:41:46.995236 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf\": container with ID starting with cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf not found: ID does not exist" containerID="cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf" Apr 16 18:41:46.995354 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.995270 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf"} err="failed to get container status \"cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf\": rpc error: code = NotFound desc = could not find container \"cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf\": container with ID starting with 
cedd5a229cb057bc6b7ba474144f329dfa2ef6225de898b30736708a544372bf not found: ID does not exist" Apr 16 18:41:46.995354 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.995299 2578 scope.go:117] "RemoveContainer" containerID="d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d" Apr 16 18:41:46.995592 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:41:46.995574 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d\": container with ID starting with d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d not found: ID does not exist" containerID="d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d" Apr 16 18:41:46.995647 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:46.995595 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d"} err="failed to get container status \"d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d\": rpc error: code = NotFound desc = could not find container \"d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d\": container with ID starting with d04840a2b52d9a898c3a7a5d46d7f69c69c2f40c6dc4db8487f6d7964cd8927d not found: ID does not exist" Apr 16 18:41:47.005426 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:47.005405 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn"] Apr 16 18:41:47.017853 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:47.017829 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-9kdtn"] Apr 16 18:41:47.648019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:47.647989 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" path="/var/lib/kubelet/pods/15ce4e3b-5d2e-4dc4-8a36-0319792e1124/volumes" Apr 16 18:41:47.983141 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:47.983113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" event={"ID":"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1","Type":"ContainerStarted","Data":"c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96"} Apr 16 18:41:47.983385 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:47.983365 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:41:48.000135 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:41:48.000086 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podStartSLOduration=7.00007263 podStartE2EDuration="7.00007263s" podCreationTimestamp="2026-04-16 18:41:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:47.999397771 +0000 UTC m=+2302.943020435" watchObservedRunningTime="2026-04-16 18:41:48.00007263 +0000 UTC m=+2302.943685705" Apr 16 18:42:18.991148 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:42:18.991046 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 18:42:28.990612 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:42:28.990568 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 18:42:38.990205 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:42:38.990161 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 18:42:48.990930 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:42:48.990884 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 18:42:58.990388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:42:58.990344 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 18:43:02.647956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:02.647919 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:43:12.012882 ip-10-0-139-88 kubenswrapper[2578]: I0416 
18:43:12.012835 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg"] Apr 16 18:43:12.013395 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.013111 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" containerID="cri-o://c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96" gracePeriod=30 Apr 16 18:43:12.079527 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.079495 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt"] Apr 16 18:43:12.079825 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.079813 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="storage-initializer" Apr 16 18:43:12.079873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.079826 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="storage-initializer" Apr 16 18:43:12.079873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.079860 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" Apr 16 18:43:12.079873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.079866 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" Apr 16 18:43:12.079966 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.079918 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15ce4e3b-5d2e-4dc4-8a36-0319792e1124" containerName="kserve-container" Apr 16 18:43:12.082915 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.082900 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:43:12.091404 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.091377 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt"] Apr 16 18:43:12.123541 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.123512 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b4d9a6d-2f28-4370-8fc4-977debbc24f4-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt\" (UID: \"7b4d9a6d-2f28-4370-8fc4-977debbc24f4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:43:12.224868 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.224821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b4d9a6d-2f28-4370-8fc4-977debbc24f4-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt\" (UID: \"7b4d9a6d-2f28-4370-8fc4-977debbc24f4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:43:12.225212 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.225162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b4d9a6d-2f28-4370-8fc4-977debbc24f4-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt\" (UID: \"7b4d9a6d-2f28-4370-8fc4-977debbc24f4\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:43:12.393358 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.393260 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:43:12.511986 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.511867 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt"] Apr 16 18:43:12.514920 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:43:12.514895 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b4d9a6d_2f28_4370_8fc4_977debbc24f4.slice/crio-1ab1283999c4056e3b1657b579311beda8ba30d4a1e5403b2b2be0cb027e5c1b WatchSource:0}: Error finding container 1ab1283999c4056e3b1657b579311beda8ba30d4a1e5403b2b2be0cb027e5c1b: Status 404 returned error can't find the container with id 1ab1283999c4056e3b1657b579311beda8ba30d4a1e5403b2b2be0cb027e5c1b Apr 16 18:43:12.645268 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:12.645151 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 16 18:43:13.237237 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:13.237204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" event={"ID":"7b4d9a6d-2f28-4370-8fc4-977debbc24f4","Type":"ContainerStarted","Data":"743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0"} Apr 16 18:43:13.237237 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:13.237243 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" 
event={"ID":"7b4d9a6d-2f28-4370-8fc4-977debbc24f4","Type":"ContainerStarted","Data":"1ab1283999c4056e3b1657b579311beda8ba30d4a1e5403b2b2be0cb027e5c1b"} Apr 16 18:43:16.855101 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:16.855077 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:43:16.967536 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:16.967510 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1-kserve-provision-location\") pod \"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1\" (UID: \"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1\") " Apr 16 18:43:16.967827 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:16.967803 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" (UID: "ddf4374b-a0e6-42d6-911e-f6c9b3e391a1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:17.068380 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.068346 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:43:17.251422 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.251343 2578 generic.go:358] "Generic (PLEG): container finished" podID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerID="c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96" exitCode=0 Apr 16 18:43:17.251422 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.251414 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" Apr 16 18:43:17.251648 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.251420 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" event={"ID":"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1","Type":"ContainerDied","Data":"c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96"} Apr 16 18:43:17.251648 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.251454 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg" event={"ID":"ddf4374b-a0e6-42d6-911e-f6c9b3e391a1","Type":"ContainerDied","Data":"ea7a610678567de3646e617f2a91a0bc9b4b8c020b756bf98b5eccafe8130d4d"} Apr 16 18:43:17.251648 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.251474 2578 scope.go:117] "RemoveContainer" containerID="c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96" Apr 16 18:43:17.252760 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.252737 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerID="743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0" exitCode=0 Apr 16 18:43:17.252888 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.252792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" event={"ID":"7b4d9a6d-2f28-4370-8fc4-977debbc24f4","Type":"ContainerDied","Data":"743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0"} Apr 16 18:43:17.260170 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.260103 2578 scope.go:117] "RemoveContainer" containerID="2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607" Apr 16 18:43:17.268034 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.268015 2578 scope.go:117] "RemoveContainer" containerID="c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96" Apr 16 18:43:17.268325 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:43:17.268299 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96\": container with ID starting with c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96 not found: ID does not exist" containerID="c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96" Apr 16 18:43:17.268441 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.268332 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96"} err="failed to get container status \"c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96\": rpc error: code = NotFound desc = could not find container \"c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96\": container with ID starting with c21cbb9f1bdd837b96b4379f4979702cf9c03d41bb0c9d32bd25de5a3aaf9b96 not found: ID does not exist" Apr 16 
18:43:17.268441 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.268356 2578 scope.go:117] "RemoveContainer" containerID="2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607" Apr 16 18:43:17.268615 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:43:17.268597 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607\": container with ID starting with 2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607 not found: ID does not exist" containerID="2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607" Apr 16 18:43:17.268683 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.268621 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607"} err="failed to get container status \"2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607\": rpc error: code = NotFound desc = could not find container \"2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607\": container with ID starting with 2480779252e6a4fce88d38c116ba8e9ddde1b7bc1306c4e2151d08075340e607 not found: ID does not exist" Apr 16 18:43:17.292244 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.292220 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg"] Apr 16 18:43:17.298301 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.298281 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-8w5zg"] Apr 16 18:43:17.648555 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:17.648475 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" path="/var/lib/kubelet/pods/ddf4374b-a0e6-42d6-911e-f6c9b3e391a1/volumes" Apr 16 
18:43:18.257509 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:18.257479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" event={"ID":"7b4d9a6d-2f28-4370-8fc4-977debbc24f4","Type":"ContainerStarted","Data":"90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0"} Apr 16 18:43:18.257935 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:18.257702 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:43:18.274697 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:18.274650 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podStartSLOduration=6.274636595 podStartE2EDuration="6.274636595s" podCreationTimestamp="2026-04-16 18:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:43:18.273122974 +0000 UTC m=+2393.216736030" watchObservedRunningTime="2026-04-16 18:43:18.274636595 +0000 UTC m=+2393.218249651" Apr 16 18:43:49.263214 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:49.263158 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 18:43:59.261570 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:43:59.261523 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" 
probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 18:44:09.261961 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:09.261909 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 18:44:19.261592 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:19.261547 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 18:44:29.262183 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:29.262139 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused" Apr 16 18:44:35.649789 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:35.649752 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:44:42.126077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:42.126040 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt"] Apr 
16 18:44:42.126502 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:42.126352 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" containerID="cri-o://90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0" gracePeriod=30 Apr 16 18:44:44.288072 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.288036 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"] Apr 16 18:44:44.288491 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.288417 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" Apr 16 18:44:44.288491 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.288431 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" Apr 16 18:44:44.288491 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.288446 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="storage-initializer" Apr 16 18:44:44.288491 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.288452 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="storage-initializer" Apr 16 18:44:44.288645 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.288507 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ddf4374b-a0e6-42d6-911e-f6c9b3e391a1" containerName="kserve-container" Apr 16 18:44:44.291450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.291434 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" Apr 16 18:44:44.298894 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.298870 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"] Apr 16 18:44:44.306181 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.306159 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b5b777f74-j56tx\" (UID: \"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" Apr 16 18:44:44.406641 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.406608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b5b777f74-j56tx\" (UID: \"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" Apr 16 18:44:44.406947 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.406929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c-kserve-provision-location\") pod \"isvc-sklearn-predictor-7b5b777f74-j56tx\" (UID: \"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" Apr 16 18:44:44.601993 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.601906 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" Apr 16 18:44:44.722452 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:44.722360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"] Apr 16 18:44:44.725201 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:44:44.725156 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod107728d2_cd5a_44f6_b4b0_5a6d1b684d2c.slice/crio-82841fcf52fd9bd46427fc013955b9ff3691b054c3e4db36fa43e9ae568a5215 WatchSource:0}: Error finding container 82841fcf52fd9bd46427fc013955b9ff3691b054c3e4db36fa43e9ae568a5215: Status 404 returned error can't find the container with id 82841fcf52fd9bd46427fc013955b9ff3691b054c3e4db36fa43e9ae568a5215 Apr 16 18:44:45.524080 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:45.524045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" event={"ID":"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c","Type":"ContainerStarted","Data":"b6ec6e3cf0857297410e93ac38d0d61906d3b3b028aedadfce1fd0e26e4860f1"} Apr 16 18:44:45.524080 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:45.524083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" event={"ID":"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c","Type":"ContainerStarted","Data":"82841fcf52fd9bd46427fc013955b9ff3691b054c3e4db36fa43e9ae568a5215"} Apr 16 18:44:45.646086 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:45.646045 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.48:8080: connect: 
connection refused" Apr 16 18:44:47.457786 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.457764 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:44:47.529726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.529699 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b4d9a6d-2f28-4370-8fc4-977debbc24f4-kserve-provision-location\") pod \"7b4d9a6d-2f28-4370-8fc4-977debbc24f4\" (UID: \"7b4d9a6d-2f28-4370-8fc4-977debbc24f4\") " Apr 16 18:44:47.530064 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.530044 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b4d9a6d-2f28-4370-8fc4-977debbc24f4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7b4d9a6d-2f28-4370-8fc4-977debbc24f4" (UID: "7b4d9a6d-2f28-4370-8fc4-977debbc24f4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:47.530694 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.530674 2578 generic.go:358] "Generic (PLEG): container finished" podID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerID="90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0" exitCode=0 Apr 16 18:44:47.530762 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.530741 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" Apr 16 18:44:47.530803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.530753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" event={"ID":"7b4d9a6d-2f28-4370-8fc4-977debbc24f4","Type":"ContainerDied","Data":"90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0"} Apr 16 18:44:47.530803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.530792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt" event={"ID":"7b4d9a6d-2f28-4370-8fc4-977debbc24f4","Type":"ContainerDied","Data":"1ab1283999c4056e3b1657b579311beda8ba30d4a1e5403b2b2be0cb027e5c1b"} Apr 16 18:44:47.530878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.530808 2578 scope.go:117] "RemoveContainer" containerID="90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0" Apr 16 18:44:47.539213 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.539183 2578 scope.go:117] "RemoveContainer" containerID="743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0" Apr 16 18:44:47.545841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.545825 2578 scope.go:117] "RemoveContainer" containerID="90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0" Apr 16 18:44:47.546080 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:44:47.546061 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0\": container with ID starting with 90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0 not found: ID does not exist" containerID="90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0" Apr 16 18:44:47.546122 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.546088 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0"} err="failed to get container status \"90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0\": rpc error: code = NotFound desc = could not find container \"90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0\": container with ID starting with 90910401349cc073419ee383122bcca58d4b3c2c9bcfcb3cb1b444232add3de0 not found: ID does not exist" Apr 16 18:44:47.546122 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.546108 2578 scope.go:117] "RemoveContainer" containerID="743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0" Apr 16 18:44:47.546341 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:44:47.546324 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0\": container with ID starting with 743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0 not found: ID does not exist" containerID="743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0" Apr 16 18:44:47.546383 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.546346 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0"} err="failed to get container status \"743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0\": rpc error: code = NotFound desc = could not find container \"743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0\": container with ID starting with 743a6536c241ef0c0d7031d441a8f3f0894171aeb1bd5b8738790fb14f1494e0 not found: ID does not exist" Apr 16 18:44:47.551089 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.551069 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt"]
Apr 16 18:44:47.556121 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.556101 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-m5lwt"]
Apr 16 18:44:47.630675 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.630595 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b4d9a6d-2f28-4370-8fc4-977debbc24f4-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:44:47.647747 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:47.647709 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" path="/var/lib/kubelet/pods/7b4d9a6d-2f28-4370-8fc4-977debbc24f4/volumes"
Apr 16 18:44:48.535970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:48.535943 2578 generic.go:358] "Generic (PLEG): container finished" podID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerID="b6ec6e3cf0857297410e93ac38d0d61906d3b3b028aedadfce1fd0e26e4860f1" exitCode=0
Apr 16 18:44:48.536311 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:48.536006 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" event={"ID":"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c","Type":"ContainerDied","Data":"b6ec6e3cf0857297410e93ac38d0d61906d3b3b028aedadfce1fd0e26e4860f1"}
Apr 16 18:44:49.540717 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:49.540685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" event={"ID":"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c","Type":"ContainerStarted","Data":"95efd61ea50270273237596c7cf428d2cecece63771ac7ff09398b1776b40fa3"}
Apr 16 18:44:49.541116 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:49.540984 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"
Apr 16 18:44:49.542448 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:49.542422 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:44:49.557381 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:49.557325 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podStartSLOduration=5.5573082639999996 podStartE2EDuration="5.557308264s" podCreationTimestamp="2026-04-16 18:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:49.556443702 +0000 UTC m=+2484.500056771" watchObservedRunningTime="2026-04-16 18:44:49.557308264 +0000 UTC m=+2484.500921320"
Apr 16 18:44:50.543934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:44:50.543898 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:00.544368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:00.544324 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:10.544704 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:10.544602 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:20.543926 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:20.543871 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:30.544873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:30.544827 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:40.544763 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:40.544719 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:50.544777 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:50.544727 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:45:52.644816 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:45:52.644771 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 16 18:46:02.646379 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:02.646349 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"
Apr 16 18:46:04.424080 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.424050 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"]
Apr 16 18:46:04.424573 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.424308 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container" containerID="cri-o://95efd61ea50270273237596c7cf428d2cecece63771ac7ff09398b1776b40fa3" gracePeriod=30
Apr 16 18:46:04.480362 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.480326 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"]
Apr 16 18:46:04.480669 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.480656 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="storage-initializer"
Apr 16 18:46:04.480726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.480673 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="storage-initializer"
Apr 16 18:46:04.480726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.480689 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container"
Apr 16 18:46:04.480726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.480694 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container"
Apr 16 18:46:04.480834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.480759 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b4d9a6d-2f28-4370-8fc4-977debbc24f4" containerName="kserve-container"
Apr 16 18:46:04.483891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.483876 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:04.493390 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.493365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"]
Apr 16 18:46:04.584474 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.584443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43120478-f4e4-454d-9537-10eedbfb68c9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-27hrz\" (UID: \"43120478-f4e4-454d-9537-10eedbfb68c9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:04.685623 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.685530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43120478-f4e4-454d-9537-10eedbfb68c9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-27hrz\" (UID: \"43120478-f4e4-454d-9537-10eedbfb68c9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:04.685915 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.685894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43120478-f4e4-454d-9537-10eedbfb68c9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-27hrz\" (UID: \"43120478-f4e4-454d-9537-10eedbfb68c9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:04.794071 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.794042 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:04.916810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.916777 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"]
Apr 16 18:46:04.919851 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:46:04.919821 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43120478_f4e4_454d_9537_10eedbfb68c9.slice/crio-0acf11e84de85db484dfda508318e0d2e8c0c6a0a14b3bc060ab89c46386825b WatchSource:0}: Error finding container 0acf11e84de85db484dfda508318e0d2e8c0c6a0a14b3bc060ab89c46386825b: Status 404 returned error can't find the container with id 0acf11e84de85db484dfda508318e0d2e8c0c6a0a14b3bc060ab89c46386825b
Apr 16 18:46:04.921588 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:04.921571 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:46:05.766793 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:05.766758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" event={"ID":"43120478-f4e4-454d-9537-10eedbfb68c9","Type":"ContainerStarted","Data":"4792cc969644aa27182372a765e5bb611c6faad033ee6c87b6d852a2ba6ac370"}
Apr 16 18:46:05.766793 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:05.766792 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" event={"ID":"43120478-f4e4-454d-9537-10eedbfb68c9","Type":"ContainerStarted","Data":"0acf11e84de85db484dfda508318e0d2e8c0c6a0a14b3bc060ab89c46386825b"}
Apr 16 18:46:08.781657 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:08.781610 2578 generic.go:358] "Generic (PLEG): container finished" podID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerID="95efd61ea50270273237596c7cf428d2cecece63771ac7ff09398b1776b40fa3" exitCode=0
Apr 16 18:46:08.781657 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:08.781636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" event={"ID":"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c","Type":"ContainerDied","Data":"95efd61ea50270273237596c7cf428d2cecece63771ac7ff09398b1776b40fa3"}
Apr 16 18:46:08.893722 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:08.893664 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"
Apr 16 18:46:09.026808 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.026772 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c-kserve-provision-location\") pod \"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c\" (UID: \"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c\") "
Apr 16 18:46:09.027137 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.027113 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" (UID: "107728d2-cd5a-44f6-b4b0-5a6d1b684d2c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:09.128285 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.128240 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:46:09.786865 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.786840 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"
Apr 16 18:46:09.786865 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.786846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx" event={"ID":"107728d2-cd5a-44f6-b4b0-5a6d1b684d2c","Type":"ContainerDied","Data":"82841fcf52fd9bd46427fc013955b9ff3691b054c3e4db36fa43e9ae568a5215"}
Apr 16 18:46:09.787418 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.786892 2578 scope.go:117] "RemoveContainer" containerID="95efd61ea50270273237596c7cf428d2cecece63771ac7ff09398b1776b40fa3"
Apr 16 18:46:09.788316 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.788295 2578 generic.go:358] "Generic (PLEG): container finished" podID="43120478-f4e4-454d-9537-10eedbfb68c9" containerID="4792cc969644aa27182372a765e5bb611c6faad033ee6c87b6d852a2ba6ac370" exitCode=0
Apr 16 18:46:09.788416 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.788361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" event={"ID":"43120478-f4e4-454d-9537-10eedbfb68c9","Type":"ContainerDied","Data":"4792cc969644aa27182372a765e5bb611c6faad033ee6c87b6d852a2ba6ac370"}
Apr 16 18:46:09.795394 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.795309 2578 scope.go:117] "RemoveContainer" containerID="b6ec6e3cf0857297410e93ac38d0d61906d3b3b028aedadfce1fd0e26e4860f1"
Apr 16 18:46:09.801033 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.801013 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"]
Apr 16 18:46:09.804575 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:09.804556 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7b5b777f74-j56tx"]
Apr 16 18:46:10.793505 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:10.793468 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" event={"ID":"43120478-f4e4-454d-9537-10eedbfb68c9","Type":"ContainerStarted","Data":"8e014624a11e22f0d95a3488c06d4239bbf1657f8e71fbd9ce4f6fff8782cd7f"}
Apr 16 18:46:10.793947 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:10.793702 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:10.812107 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:10.812060 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" podStartSLOduration=6.812046841 podStartE2EDuration="6.812046841s" podCreationTimestamp="2026-04-16 18:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:46:10.809855374 +0000 UTC m=+2565.753468431" watchObservedRunningTime="2026-04-16 18:46:10.812046841 +0000 UTC m=+2565.755659897"
Apr 16 18:46:11.648342 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:11.648309 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" path="/var/lib/kubelet/pods/107728d2-cd5a-44f6-b4b0-5a6d1b684d2c/volumes"
Apr 16 18:46:41.841202 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:41.841102 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 18:46:51.799367 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:51.799340 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:46:54.593359 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.593324 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"]
Apr 16 18:46:54.593814 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.593674 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="kserve-container" containerID="cri-o://8e014624a11e22f0d95a3488c06d4239bbf1657f8e71fbd9ce4f6fff8782cd7f" gracePeriod=30
Apr 16 18:46:54.640083 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.640044 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"]
Apr 16 18:46:54.640470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.640429 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="storage-initializer"
Apr 16 18:46:54.640470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.640446 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="storage-initializer"
Apr 16 18:46:54.640470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.640455 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container"
Apr 16 18:46:54.640470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.640461 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container"
Apr 16 18:46:54.640725 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.640511 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="107728d2-cd5a-44f6-b4b0-5a6d1b684d2c" containerName="kserve-container"
Apr 16 18:46:54.643245 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.643223 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:46:54.651171 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.650740 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"]
Apr 16 18:46:54.700427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.700399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw\" (UID: \"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:46:54.801354 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.801318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw\" (UID: \"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:46:54.801662 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.801646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw\" (UID: \"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:46:54.954811 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:54.954781 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:46:55.072113 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:55.072051 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"]
Apr 16 18:46:55.074918 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:46:55.074891 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode50f355c_a2c8_4d1f_b7e5_e5b8fb9b8f1b.slice/crio-4687a1d225c06eb33ee1a2c63dd71f6dafb356ec42af9311a5907cc76b25f642 WatchSource:0}: Error finding container 4687a1d225c06eb33ee1a2c63dd71f6dafb356ec42af9311a5907cc76b25f642: Status 404 returned error can't find the container with id 4687a1d225c06eb33ee1a2c63dd71f6dafb356ec42af9311a5907cc76b25f642
Apr 16 18:46:55.935972 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:55.935937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" event={"ID":"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b","Type":"ContainerStarted","Data":"4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f"}
Apr 16 18:46:55.935972 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:46:55.935971 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" event={"ID":"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b","Type":"ContainerStarted","Data":"4687a1d225c06eb33ee1a2c63dd71f6dafb356ec42af9311a5907cc76b25f642"}
Apr 16 18:47:00.953822 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:00.953787 2578 generic.go:358] "Generic (PLEG): container finished" podID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerID="4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f" exitCode=0
Apr 16 18:47:00.954323 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:00.953843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" event={"ID":"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b","Type":"ContainerDied","Data":"4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f"}
Apr 16 18:47:01.797998 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.797955 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.50:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 16 18:47:01.959646 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.959614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" event={"ID":"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b","Type":"ContainerStarted","Data":"c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd"}
Apr 16 18:47:01.960066 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.960047 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:47:01.962004 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.961697 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 18:47:01.963211 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.963164 2578 generic.go:358] "Generic (PLEG): container finished" podID="43120478-f4e4-454d-9537-10eedbfb68c9" containerID="8e014624a11e22f0d95a3488c06d4239bbf1657f8e71fbd9ce4f6fff8782cd7f" exitCode=0
Apr 16 18:47:01.963323 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.963235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" event={"ID":"43120478-f4e4-454d-9537-10eedbfb68c9","Type":"ContainerDied","Data":"8e014624a11e22f0d95a3488c06d4239bbf1657f8e71fbd9ce4f6fff8782cd7f"}
Apr 16 18:47:01.977570 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:01.977515 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" podStartSLOduration=7.977496418 podStartE2EDuration="7.977496418s" podCreationTimestamp="2026-04-16 18:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:01.974984408 +0000 UTC m=+2616.918597477" watchObservedRunningTime="2026-04-16 18:47:01.977496418 +0000 UTC m=+2616.921109475"
Apr 16 18:47:02.039789 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.039765 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:47:02.169294 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.169184 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43120478-f4e4-454d-9537-10eedbfb68c9-kserve-provision-location\") pod \"43120478-f4e4-454d-9537-10eedbfb68c9\" (UID: \"43120478-f4e4-454d-9537-10eedbfb68c9\") "
Apr 16 18:47:02.169449 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.169327 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43120478-f4e4-454d-9537-10eedbfb68c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "43120478-f4e4-454d-9537-10eedbfb68c9" (UID: "43120478-f4e4-454d-9537-10eedbfb68c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:47:02.169505 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.169456 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43120478-f4e4-454d-9537-10eedbfb68c9-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:47:02.967905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.967878 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"
Apr 16 18:47:02.967905 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.967883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz" event={"ID":"43120478-f4e4-454d-9537-10eedbfb68c9","Type":"ContainerDied","Data":"0acf11e84de85db484dfda508318e0d2e8c0c6a0a14b3bc060ab89c46386825b"}
Apr 16 18:47:02.968461 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.967933 2578 scope.go:117] "RemoveContainer" containerID="8e014624a11e22f0d95a3488c06d4239bbf1657f8e71fbd9ce4f6fff8782cd7f"
Apr 16 18:47:02.968461 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.968383 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 18:47:02.976896 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.976875 2578 scope.go:117] "RemoveContainer" containerID="4792cc969644aa27182372a765e5bb611c6faad033ee6c87b6d852a2ba6ac370"
Apr 16 18:47:02.988138 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.988113 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"]
Apr 16 18:47:02.990244 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:02.990227 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-27hrz"]
Apr 16 18:47:03.647759 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:03.647730 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" path="/var/lib/kubelet/pods/43120478-f4e4-454d-9537-10eedbfb68c9/volumes"
Apr 16 18:47:12.968990 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:12.968938 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.51:8080: connect: connection refused"
Apr 16 18:47:22.969993 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:22.969964 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:47:31.676494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.676458 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw_e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b/kserve-container/0.log"
Apr 16 18:47:31.829437 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.829403 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"]
Apr 16 18:47:31.829681 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.829660 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" containerID="cri-o://c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd" gracePeriod=30
Apr 16 18:47:31.868154 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.868124 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"]
Apr 16 18:47:31.868537 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.868521 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="storage-initializer"
Apr 16 18:47:31.868582 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.868539 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="storage-initializer"
Apr 16 18:47:31.868582 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.868551 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="kserve-container"
Apr 16 18:47:31.868582 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.868556 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="kserve-container"
Apr 16 18:47:31.868682 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.868618 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="43120478-f4e4-454d-9537-10eedbfb68c9" containerName="kserve-container"
Apr 16 18:47:31.873010 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.872993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"
Apr 16 18:47:31.878584 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.878558 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"]
Apr 16 18:47:31.914616 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:31.914584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e456529-5b72-4934-94d7-5a66967a888c-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl\" (UID: \"7e456529-5b72-4934-94d7-5a66967a888c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"
Apr 16 18:47:32.015214 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:32.015157 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e456529-5b72-4934-94d7-5a66967a888c-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl\" (UID: \"7e456529-5b72-4934-94d7-5a66967a888c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"
Apr 16 18:47:32.015514 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:32.015494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e456529-5b72-4934-94d7-5a66967a888c-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl\" (UID: \"7e456529-5b72-4934-94d7-5a66967a888c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"
Apr 16 18:47:32.183817 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:32.183785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"
Apr 16 18:47:32.319115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:32.319078 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"]
Apr 16 18:47:32.321771 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:47:32.321743 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e456529_5b72_4934_94d7_5a66967a888c.slice/crio-ac1abba0c259a497221276d93c6363664bd78c75d868f0268612346c7cbcde7b WatchSource:0}: Error finding container ac1abba0c259a497221276d93c6363664bd78c75d868f0268612346c7cbcde7b: Status 404 returned error can't find the container with id ac1abba0c259a497221276d93c6363664bd78c75d868f0268612346c7cbcde7b
Apr 16 18:47:32.950856 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:32.950834 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:47:33.023984 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.023946 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b-kserve-provision-location\") pod \"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b\" (UID: \"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b\") "
Apr 16 18:47:33.048239 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.048151 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" (UID: "e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:47:33.063260 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.063222 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" event={"ID":"7e456529-5b72-4934-94d7-5a66967a888c","Type":"ContainerStarted","Data":"2ce68a9cb8708573c4701d7677d3bfce42e1d6d3695ae1a1095d46c797ee5318"}
Apr 16 18:47:33.063418 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.063268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" event={"ID":"7e456529-5b72-4934-94d7-5a66967a888c","Type":"ContainerStarted","Data":"ac1abba0c259a497221276d93c6363664bd78c75d868f0268612346c7cbcde7b"}
Apr 16 18:47:33.064577 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.064551 2578 generic.go:358] "Generic (PLEG): container finished" podID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerID="c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd" exitCode=0
Apr 16 18:47:33.064686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.064589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" event={"ID":"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b","Type":"ContainerDied","Data":"c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd"}
Apr 16 18:47:33.064686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.064610 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"
Apr 16 18:47:33.064686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.064612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw" event={"ID":"e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b","Type":"ContainerDied","Data":"4687a1d225c06eb33ee1a2c63dd71f6dafb356ec42af9311a5907cc76b25f642"}
Apr 16 18:47:33.064686 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.064633 2578 scope.go:117] "RemoveContainer" containerID="c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd"
Apr 16 18:47:33.072932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.072917 2578 scope.go:117] "RemoveContainer" containerID="4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f"
Apr 16 18:47:33.079685 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.079670 2578 scope.go:117] "RemoveContainer" containerID="c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd"
Apr 16 18:47:33.079941 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:47:33.079922 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd\": container with ID starting with c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd not found: ID does not exist"
containerID="c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd" Apr 16 18:47:33.080033 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.079948 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd"} err="failed to get container status \"c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd\": rpc error: code = NotFound desc = could not find container \"c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd\": container with ID starting with c12c3989ec9f2c722124f46a3a7736a511db4d21f95ebf14858a5985a25becbd not found: ID does not exist" Apr 16 18:47:33.080033 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.079964 2578 scope.go:117] "RemoveContainer" containerID="4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f" Apr 16 18:47:33.080221 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:47:33.080181 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f\": container with ID starting with 4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f not found: ID does not exist" containerID="4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f" Apr 16 18:47:33.080295 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.080232 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f"} err="failed to get container status \"4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f\": rpc error: code = NotFound desc = could not find container \"4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f\": container with ID starting with 4634cadd9f20310acf21d40d718437e27f943a3e7a3782e1ccd83df051513a9f not found: ID does not exist" Apr 16 
18:47:33.094433 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.094405 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"] Apr 16 18:47:33.095987 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.095963 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-5c7bb4b848-6szmw"] Apr 16 18:47:33.124798 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.124773 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:47:33.648310 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:33.648270 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" path="/var/lib/kubelet/pods/e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b/volumes" Apr 16 18:47:36.075285 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:36.075250 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e456529-5b72-4934-94d7-5a66967a888c" containerID="2ce68a9cb8708573c4701d7677d3bfce42e1d6d3695ae1a1095d46c797ee5318" exitCode=0 Apr 16 18:47:36.075285 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:36.075280 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" event={"ID":"7e456529-5b72-4934-94d7-5a66967a888c","Type":"ContainerDied","Data":"2ce68a9cb8708573c4701d7677d3bfce42e1d6d3695ae1a1095d46c797ee5318"} Apr 16 18:47:37.080376 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:37.080339 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" 
event={"ID":"7e456529-5b72-4934-94d7-5a66967a888c","Type":"ContainerStarted","Data":"2693c20bae5130ad6032c613f68f14d2331b20fd7ee71a9a285340c8b07202cc"} Apr 16 18:47:37.080841 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:37.080574 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" Apr 16 18:47:37.098542 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:47:37.098342 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" podStartSLOduration=6.09832274 podStartE2EDuration="6.09832274s" podCreationTimestamp="2026-04-16 18:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:37.097475989 +0000 UTC m=+2652.041089045" watchObservedRunningTime="2026-04-16 18:47:37.09832274 +0000 UTC m=+2652.041935795" Apr 16 18:48:08.141314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:08.141212 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 18:48:18.085328 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:18.085298 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" Apr 16 18:48:21.969733 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:21.969697 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"] Apr 16 18:48:21.970220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:21.969969 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="kserve-container" containerID="cri-o://2693c20bae5130ad6032c613f68f14d2331b20fd7ee71a9a285340c8b07202cc" gracePeriod=30 Apr 16 18:48:22.029046 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.029008 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82"] Apr 16 18:48:22.029358 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.029345 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="storage-initializer" Apr 16 18:48:22.029414 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.029359 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="storage-initializer" Apr 16 18:48:22.029414 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.029378 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" Apr 16 18:48:22.029414 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.029384 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" Apr 16 18:48:22.029529 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.029451 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e50f355c-a2c8-4d1f-b7e5-e5b8fb9b8f1b" containerName="kserve-container" Apr 16 18:48:22.032772 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.032755 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:48:22.043030 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.043004 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7bc9649f6b-rrh82\" (UID: \"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:48:22.043227 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.043205 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82"] Apr 16 18:48:22.144046 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.143993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7bc9649f6b-rrh82\" (UID: \"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:48:22.144384 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.144363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-7bc9649f6b-rrh82\" (UID: \"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:48:22.343679 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.343583 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:48:22.464828 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:22.464747 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82"] Apr 16 18:48:22.467643 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:48:22.467615 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee4fa09f_fd22_46d2_a4c3_0600aafe0fa9.slice/crio-9e3a3046c22904110c481141e505cce773aa9e17977b0d41b377e2337d59fbca WatchSource:0}: Error finding container 9e3a3046c22904110c481141e505cce773aa9e17977b0d41b377e2337d59fbca: Status 404 returned error can't find the container with id 9e3a3046c22904110c481141e505cce773aa9e17977b0d41b377e2337d59fbca Apr 16 18:48:23.220427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:23.220391 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" event={"ID":"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9","Type":"ContainerStarted","Data":"126114b5d24f113a953def646d7202dd4a405338ed58bd57f8c569b879aa8544"} Apr 16 18:48:23.220427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:23.220432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" event={"ID":"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9","Type":"ContainerStarted","Data":"9e3a3046c22904110c481141e505cce773aa9e17977b0d41b377e2337d59fbca"} Apr 16 18:48:27.233836 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:27.233803 2578 generic.go:358] "Generic (PLEG): container finished" podID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerID="126114b5d24f113a953def646d7202dd4a405338ed58bd57f8c569b879aa8544" exitCode=0 Apr 16 18:48:27.234233 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:27.233876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" event={"ID":"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9","Type":"ContainerDied","Data":"126114b5d24f113a953def646d7202dd4a405338ed58bd57f8c569b879aa8544"} Apr 16 18:48:28.083701 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:28.083661 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.52:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.52:8080: connect: connection refused" Apr 16 18:48:28.238668 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:28.238638 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" event={"ID":"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9","Type":"ContainerStarted","Data":"bed4de655f111075f049041137de8a48824b524f67eb9aa83135589230964d40"} Apr 16 18:48:28.239081 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:28.239001 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:48:28.240409 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:28.240385 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:48:28.271450 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:28.271405 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podStartSLOduration=6.271387336 podStartE2EDuration="6.271387336s" podCreationTimestamp="2026-04-16 18:48:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:48:28.270818309 +0000 UTC m=+2703.214431365" watchObservedRunningTime="2026-04-16 18:48:28.271387336 +0000 UTC m=+2703.215000394" Apr 16 18:48:29.243926 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.243892 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e456529-5b72-4934-94d7-5a66967a888c" containerID="2693c20bae5130ad6032c613f68f14d2331b20fd7ee71a9a285340c8b07202cc" exitCode=0 Apr 16 18:48:29.244283 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.243960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" event={"ID":"7e456529-5b72-4934-94d7-5a66967a888c","Type":"ContainerDied","Data":"2693c20bae5130ad6032c613f68f14d2331b20fd7ee71a9a285340c8b07202cc"} Apr 16 18:48:29.244637 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.244612 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:48:29.315423 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.315400 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" Apr 16 18:48:29.403454 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.403376 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e456529-5b72-4934-94d7-5a66967a888c-kserve-provision-location\") pod \"7e456529-5b72-4934-94d7-5a66967a888c\" (UID: \"7e456529-5b72-4934-94d7-5a66967a888c\") " Apr 16 18:48:29.403720 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.403700 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e456529-5b72-4934-94d7-5a66967a888c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e456529-5b72-4934-94d7-5a66967a888c" (UID: "7e456529-5b72-4934-94d7-5a66967a888c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:48:29.504016 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:29.503985 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e456529-5b72-4934-94d7-5a66967a888c-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:48:30.248597 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:30.248562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" event={"ID":"7e456529-5b72-4934-94d7-5a66967a888c","Type":"ContainerDied","Data":"ac1abba0c259a497221276d93c6363664bd78c75d868f0268612346c7cbcde7b"} Apr 16 18:48:30.249027 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:30.248612 2578 scope.go:117] "RemoveContainer" containerID="2693c20bae5130ad6032c613f68f14d2331b20fd7ee71a9a285340c8b07202cc" Apr 16 18:48:30.249027 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:30.248616 2578 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl" Apr 16 18:48:30.256457 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:30.256434 2578 scope.go:117] "RemoveContainer" containerID="2ce68a9cb8708573c4701d7677d3bfce42e1d6d3695ae1a1095d46c797ee5318" Apr 16 18:48:30.264634 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:30.264612 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"] Apr 16 18:48:30.269834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:30.269815 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-4vntl"] Apr 16 18:48:31.647854 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:31.647825 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e456529-5b72-4934-94d7-5a66967a888c" path="/var/lib/kubelet/pods/7e456529-5b72-4934-94d7-5a66967a888c/volumes" Apr 16 18:48:39.245479 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:39.245431 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:48:49.245623 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:49.245570 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:48:59.245057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:48:59.245013 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" 
podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:49:09.244946 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:09.244902 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:49:19.245336 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:19.245292 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:49:29.244782 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:29.244740 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.53:8080: connect: connection refused" Apr 16 18:49:39.246290 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:39.246204 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:49:42.260975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.260936 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82"] Apr 16 18:49:42.261458 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.261261 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" 
containerName="kserve-container" containerID="cri-o://bed4de655f111075f049041137de8a48824b524f67eb9aa83135589230964d40" gracePeriod=30 Apr 16 18:49:42.332262 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.332229 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w"] Apr 16 18:49:42.332567 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.332555 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="storage-initializer" Apr 16 18:49:42.332615 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.332569 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="storage-initializer" Apr 16 18:49:42.332615 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.332591 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="kserve-container" Apr 16 18:49:42.332615 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.332596 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="kserve-container" Apr 16 18:49:42.337319 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.333263 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e456529-5b72-4934-94d7-5a66967a888c" containerName="kserve-container" Apr 16 18:49:42.338828 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.338803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:49:42.344743 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.344716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w"] Apr 16 18:49:42.425597 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.425552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w\" (UID: \"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:49:42.526975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.526887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w\" (UID: \"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:49:42.527254 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.527237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w\" (UID: \"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:49:42.651309 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.651262 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:49:42.773995 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:42.773854 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w"] Apr 16 18:49:42.776769 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:49:42.776746 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07fdec94_8e25_44cc_a7c8_5e0fb3fb8b92.slice/crio-165c9e3958a13efc51b8c7e06d57d23021d4334e827cb9dc99a5e70daef45ea4 WatchSource:0}: Error finding container 165c9e3958a13efc51b8c7e06d57d23021d4334e827cb9dc99a5e70daef45ea4: Status 404 returned error can't find the container with id 165c9e3958a13efc51b8c7e06d57d23021d4334e827cb9dc99a5e70daef45ea4 Apr 16 18:49:43.465338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:43.465300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" event={"ID":"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92","Type":"ContainerStarted","Data":"addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c"} Apr 16 18:49:43.465716 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:43.465345 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" event={"ID":"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92","Type":"ContainerStarted","Data":"165c9e3958a13efc51b8c7e06d57d23021d4334e827cb9dc99a5e70daef45ea4"} Apr 16 18:49:46.476475 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:46.476445 2578 generic.go:358] "Generic (PLEG): container finished" podID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerID="bed4de655f111075f049041137de8a48824b524f67eb9aa83135589230964d40" exitCode=0 Apr 16 18:49:46.476786 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:46.476494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" event={"ID":"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9","Type":"ContainerDied","Data":"bed4de655f111075f049041137de8a48824b524f67eb9aa83135589230964d40"} Apr 16 18:49:46.595782 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:46.595759 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:49:46.659041 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:46.659009 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9-kserve-provision-location\") pod \"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9\" (UID: \"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9\") " Apr 16 18:49:46.659368 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:46.659349 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" (UID: "ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:49:46.760124 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:46.760084 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:49:47.480712 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.480678 2578 generic.go:358] "Generic (PLEG): container finished" podID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerID="addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c" exitCode=0 Apr 16 18:49:47.481217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.480755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" event={"ID":"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92","Type":"ContainerDied","Data":"addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c"} Apr 16 18:49:47.482185 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.482162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" event={"ID":"ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9","Type":"ContainerDied","Data":"9e3a3046c22904110c481141e505cce773aa9e17977b0d41b377e2337d59fbca"} Apr 16 18:49:47.482291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.482213 2578 scope.go:117] "RemoveContainer" containerID="bed4de655f111075f049041137de8a48824b524f67eb9aa83135589230964d40" Apr 16 18:49:47.482291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.482237 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82" Apr 16 18:49:47.492262 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.491971 2578 scope.go:117] "RemoveContainer" containerID="126114b5d24f113a953def646d7202dd4a405338ed58bd57f8c569b879aa8544" Apr 16 18:49:47.506671 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.506649 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82"] Apr 16 18:49:47.510448 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.510427 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-7bc9649f6b-rrh82"] Apr 16 18:49:47.647943 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:47.647907 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" path="/var/lib/kubelet/pods/ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9/volumes" Apr 16 18:49:48.486484 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:48.486451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" event={"ID":"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92","Type":"ContainerStarted","Data":"7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9"} Apr 16 18:49:48.486934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:48.486745 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:49:48.487882 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:48.487855 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:49:48.500747 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:49:48.500644 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podStartSLOduration=6.500631989 podStartE2EDuration="6.500631989s" podCreationTimestamp="2026-04-16 18:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:48.50048932 +0000 UTC m=+2783.444102373" watchObservedRunningTime="2026-04-16 18:49:48.500631989 +0000 UTC m=+2783.444245044" Apr 16 18:49:49.490843 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:49.490809 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:49:59.491814 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:49:59.491773 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:50:09.491617 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:50:09.491574 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:50:19.491001 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:50:19.490960 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:50:29.491002 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:50:29.490958 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:50:39.491584 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:50:39.491537 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:50:49.491480 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:50:49.491440 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 16 18:50:59.492382 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:50:59.492354 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:51:02.460538 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.460500 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w"] Apr 16 18:51:02.460932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.460851 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" 
containerID="cri-o://7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9" gracePeriod=30 Apr 16 18:51:02.509494 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.509460 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw"] Apr 16 18:51:02.509812 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.509799 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="storage-initializer" Apr 16 18:51:02.509858 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.509814 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="storage-initializer" Apr 16 18:51:02.509858 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.509823 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" Apr 16 18:51:02.509858 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.509829 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" Apr 16 18:51:02.509965 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.509893 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee4fa09f-fd22-46d2-a4c3-0600aafe0fa9" containerName="kserve-container" Apr 16 18:51:02.512818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.512799 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:02.522370 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.522339 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw"] Apr 16 18:51:02.704755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.704707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9de2403a-5164-4b17-8dfc-6cd3c579c805-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-d9lmw\" (UID: \"9de2403a-5164-4b17-8dfc-6cd3c579c805\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:02.805564 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.805468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9de2403a-5164-4b17-8dfc-6cd3c579c805-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-d9lmw\" (UID: \"9de2403a-5164-4b17-8dfc-6cd3c579c805\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:02.805918 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.805883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9de2403a-5164-4b17-8dfc-6cd3c579c805-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-d9lmw\" (UID: \"9de2403a-5164-4b17-8dfc-6cd3c579c805\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:02.822948 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.822923 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:02.958281 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:02.958250 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw"] Apr 16 18:51:02.961910 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:51:02.961866 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de2403a_5164_4b17_8dfc_6cd3c579c805.slice/crio-9a2b516d0f53086c70f11aaa036fd688caf731bab848365f9f7ab9fbc7d68c74 WatchSource:0}: Error finding container 9a2b516d0f53086c70f11aaa036fd688caf731bab848365f9f7ab9fbc7d68c74: Status 404 returned error can't find the container with id 9a2b516d0f53086c70f11aaa036fd688caf731bab848365f9f7ab9fbc7d68c74 Apr 16 18:51:03.720647 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:03.720615 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" event={"ID":"9de2403a-5164-4b17-8dfc-6cd3c579c805","Type":"ContainerStarted","Data":"4cb4d3802f0f957bf15848c176357b2db47f0ade924f95d0bf6bc3b1241210f8"} Apr 16 18:51:03.720647 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:03.720650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" event={"ID":"9de2403a-5164-4b17-8dfc-6cd3c579c805","Type":"ContainerStarted","Data":"9a2b516d0f53086c70f11aaa036fd688caf731bab848365f9f7ab9fbc7d68c74"} Apr 16 18:51:06.902250 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:06.902225 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:51:07.039943 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.039858 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92-kserve-provision-location\") pod \"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92\" (UID: \"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92\") " Apr 16 18:51:07.040217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.040174 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" (UID: "07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:51:07.141247 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.141217 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:51:07.732792 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.732752 2578 generic.go:358] "Generic (PLEG): container finished" podID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerID="7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9" exitCode=0 Apr 16 18:51:07.732975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.732834 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" Apr 16 18:51:07.732975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.732842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" event={"ID":"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92","Type":"ContainerDied","Data":"7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9"} Apr 16 18:51:07.732975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.732890 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w" event={"ID":"07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92","Type":"ContainerDied","Data":"165c9e3958a13efc51b8c7e06d57d23021d4334e827cb9dc99a5e70daef45ea4"} Apr 16 18:51:07.732975 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.732912 2578 scope.go:117] "RemoveContainer" containerID="7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9" Apr 16 18:51:07.741124 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.741107 2578 scope.go:117] "RemoveContainer" containerID="addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c" Apr 16 18:51:07.747644 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.747623 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w"] Apr 16 18:51:07.748626 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.748608 2578 scope.go:117] "RemoveContainer" containerID="7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9" Apr 16 18:51:07.748922 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:51:07.748901 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9\": container with ID starting with 7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9 not 
found: ID does not exist" containerID="7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9" Apr 16 18:51:07.749013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.748931 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9"} err="failed to get container status \"7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9\": rpc error: code = NotFound desc = could not find container \"7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9\": container with ID starting with 7682d76bbf73a89d06561ccba88b532d4dae34641c9056fcfab8b568ee045bc9 not found: ID does not exist" Apr 16 18:51:07.749013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.748989 2578 scope.go:117] "RemoveContainer" containerID="addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c" Apr 16 18:51:07.749860 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:51:07.749806 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c\": container with ID starting with addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c not found: ID does not exist" containerID="addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c" Apr 16 18:51:07.749960 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.749871 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c"} err="failed to get container status \"addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c\": rpc error: code = NotFound desc = could not find container \"addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c\": container with ID starting with addf1089e72cccf6fa8445ed1c46df42499242ae0eb1fa6d7c91fe4046c37f7c not found: ID does not 
exist" Apr 16 18:51:07.750939 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:07.750922 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-55fdc46497-xgg8w"] Apr 16 18:51:08.737532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:08.737500 2578 generic.go:358] "Generic (PLEG): container finished" podID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerID="4cb4d3802f0f957bf15848c176357b2db47f0ade924f95d0bf6bc3b1241210f8" exitCode=0 Apr 16 18:51:08.737983 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:08.737575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" event={"ID":"9de2403a-5164-4b17-8dfc-6cd3c579c805","Type":"ContainerDied","Data":"4cb4d3802f0f957bf15848c176357b2db47f0ade924f95d0bf6bc3b1241210f8"} Apr 16 18:51:08.738820 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:08.738805 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:51:09.649761 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:09.649723 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" path="/var/lib/kubelet/pods/07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92/volumes" Apr 16 18:51:12.753804 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:12.753771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" event={"ID":"9de2403a-5164-4b17-8dfc-6cd3c579c805","Type":"ContainerStarted","Data":"8dc1c074e01b00f53a69b0d36449f14b29a9a020449cacbdc0327b48cc97173d"} Apr 16 18:51:12.754267 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:12.754054 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:12.755427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:12.755403 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 18:51:12.770695 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:12.770645 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" podStartSLOduration=7.215841765 podStartE2EDuration="10.770632369s" podCreationTimestamp="2026-04-16 18:51:02 +0000 UTC" firstStartedPulling="2026-04-16 18:51:08.738934332 +0000 UTC m=+2863.682547366" lastFinishedPulling="2026-04-16 18:51:12.293724933 +0000 UTC m=+2867.237337970" observedRunningTime="2026-04-16 18:51:12.768917703 +0000 UTC m=+2867.712530769" watchObservedRunningTime="2026-04-16 18:51:12.770632369 +0000 UTC m=+2867.714245424" Apr 16 18:51:13.757140 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:13.757101 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 18:51:23.757541 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:23.757500 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 18:51:33.758460 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:33.758428 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:51:53.556206 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.556159 2578 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw"] Apr 16 18:51:53.556810 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.556518 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" containerID="cri-o://8dc1c074e01b00f53a69b0d36449f14b29a9a020449cacbdc0327b48cc97173d" gracePeriod=30 Apr 16 18:51:53.625898 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.625858 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj"] Apr 16 18:51:53.626275 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.626259 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="storage-initializer" Apr 16 18:51:53.626369 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.626280 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="storage-initializer" Apr 16 18:51:53.626369 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.626295 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" Apr 16 18:51:53.626369 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.626300 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" Apr 16 18:51:53.626369 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.626368 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="07fdec94-8e25-44cc-a7c8-5e0fb3fb8b92" containerName="kserve-container" Apr 16 18:51:53.630878 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.630857 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:51:53.636839 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.636813 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj"] Apr 16 18:51:53.729597 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.729559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj\" (UID: \"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:51:53.830780 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.830685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj\" (UID: \"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:51:53.831115 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.831091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj\" (UID: \"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:51:53.942459 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:53.942426 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:51:54.056396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:54.056255 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj"] Apr 16 18:51:54.059029 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:51:54.059001 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab57e36_d3a8_44ba_bb7f_4847aa27fd42.slice/crio-5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662 WatchSource:0}: Error finding container 5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662: Status 404 returned error can't find the container with id 5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662 Apr 16 18:51:54.880326 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:54.880294 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" event={"ID":"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42","Type":"ContainerStarted","Data":"f1356d74747ee4c7ac5d3252ed07efda0c6933eb001982252e472e704535fed5"} Apr 16 18:51:54.880326 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:54.880332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" event={"ID":"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42","Type":"ContainerStarted","Data":"5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662"} Apr 16 18:51:58.893798 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:58.893770 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerID="f1356d74747ee4c7ac5d3252ed07efda0c6933eb001982252e472e704535fed5" exitCode=0 Apr 16 18:51:58.894166 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:58.893854 2578 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" event={"ID":"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42","Type":"ContainerDied","Data":"f1356d74747ee4c7ac5d3252ed07efda0c6933eb001982252e472e704535fed5"} Apr 16 18:51:59.898510 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:59.898479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" event={"ID":"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42","Type":"ContainerStarted","Data":"73d1ee197cfc2ba2b6c9c356436ec65ea24f238fbf1f81a3120b861a5aab395b"} Apr 16 18:51:59.899011 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:59.898777 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:51:59.900100 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:59.900072 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 18:51:59.914866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:51:59.914830 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" podStartSLOduration=6.914816942 podStartE2EDuration="6.914816942s" podCreationTimestamp="2026-04-16 18:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:51:59.913252823 +0000 UTC m=+2914.856865882" watchObservedRunningTime="2026-04-16 18:51:59.914816942 +0000 UTC m=+2914.858430076" Apr 16 18:52:00.902100 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:00.902062 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused" Apr 16 18:52:10.903321 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:10.903290 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:52:23.757360 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:23.757316 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused" Apr 16 18:52:23.971077 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:23.971047 2578 generic.go:358] "Generic (PLEG): container finished" podID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerID="8dc1c074e01b00f53a69b0d36449f14b29a9a020449cacbdc0327b48cc97173d" exitCode=137 Apr 16 18:52:23.971250 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:23.971083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" event={"ID":"9de2403a-5164-4b17-8dfc-6cd3c579c805","Type":"ContainerDied","Data":"8dc1c074e01b00f53a69b0d36449f14b29a9a020449cacbdc0327b48cc97173d"} Apr 16 18:52:24.191782 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.191760 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:52:24.286399 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.286295 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9de2403a-5164-4b17-8dfc-6cd3c579c805-kserve-provision-location\") pod \"9de2403a-5164-4b17-8dfc-6cd3c579c805\" (UID: \"9de2403a-5164-4b17-8dfc-6cd3c579c805\") " Apr 16 18:52:24.297379 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.297346 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9de2403a-5164-4b17-8dfc-6cd3c579c805-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9de2403a-5164-4b17-8dfc-6cd3c579c805" (UID: "9de2403a-5164-4b17-8dfc-6cd3c579c805"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:24.387406 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.387370 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9de2403a-5164-4b17-8dfc-6cd3c579c805-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:52:24.569468 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.569378 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj"] Apr 16 18:52:24.569751 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.569705 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="kserve-container" containerID="cri-o://73d1ee197cfc2ba2b6c9c356436ec65ea24f238fbf1f81a3120b861a5aab395b" gracePeriod=30 Apr 16 18:52:24.633038 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:52:24.633002 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc"] Apr 16 18:52:24.633418 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.633404 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="storage-initializer" Apr 16 18:52:24.633487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.633422 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="storage-initializer" Apr 16 18:52:24.633487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.633439 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" Apr 16 18:52:24.633487 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.633445 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" Apr 16 18:52:24.633609 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.633504 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" containerName="kserve-container" Apr 16 18:52:24.636697 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.636681 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:52:24.643738 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.643713 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc"] Apr 16 18:52:24.689680 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.689640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc3e9fc6-5a61-4c6f-9088-875636ffed68-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ztjlc\" (UID: \"bc3e9fc6-5a61-4c6f-9088-875636ffed68\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:52:24.790387 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.790347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc3e9fc6-5a61-4c6f-9088-875636ffed68-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ztjlc\" (UID: \"bc3e9fc6-5a61-4c6f-9088-875636ffed68\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:52:24.790833 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.790781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc3e9fc6-5a61-4c6f-9088-875636ffed68-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ztjlc\" (UID: \"bc3e9fc6-5a61-4c6f-9088-875636ffed68\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:52:24.948331 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.948240 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:52:24.976741 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.976711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" event={"ID":"9de2403a-5164-4b17-8dfc-6cd3c579c805","Type":"ContainerDied","Data":"9a2b516d0f53086c70f11aaa036fd688caf731bab848365f9f7ab9fbc7d68c74"} Apr 16 18:52:24.976895 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.976759 2578 scope.go:117] "RemoveContainer" containerID="8dc1c074e01b00f53a69b0d36449f14b29a9a020449cacbdc0327b48cc97173d" Apr 16 18:52:24.976895 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.976772 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw" Apr 16 18:52:24.986932 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:24.986910 2578 scope.go:117] "RemoveContainer" containerID="4cb4d3802f0f957bf15848c176357b2db47f0ade924f95d0bf6bc3b1241210f8" Apr 16 18:52:25.001803 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:25.001775 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw"] Apr 16 18:52:25.007078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:25.007055 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-d9lmw"] Apr 16 18:52:25.071235 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:25.071204 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc"] Apr 16 18:52:25.073876 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:52:25.073842 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-050eab03ce4ada781ab9edec5c76e9e26aabb04497616055a86c180d1b22337d WatchSource:0}: Error finding container 050eab03ce4ada781ab9edec5c76e9e26aabb04497616055a86c180d1b22337d: Status 404 returned error can't find the container with id 050eab03ce4ada781ab9edec5c76e9e26aabb04497616055a86c180d1b22337d Apr 16 18:52:25.648597 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:25.648566 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de2403a-5164-4b17-8dfc-6cd3c579c805" path="/var/lib/kubelet/pods/9de2403a-5164-4b17-8dfc-6cd3c579c805/volumes" Apr 16 18:52:25.980903 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:25.980872 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" event={"ID":"bc3e9fc6-5a61-4c6f-9088-875636ffed68","Type":"ContainerStarted","Data":"a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a"} Apr 16 18:52:25.980903 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:25.980909 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" event={"ID":"bc3e9fc6-5a61-4c6f-9088-875636ffed68","Type":"ContainerStarted","Data":"050eab03ce4ada781ab9edec5c76e9e26aabb04497616055a86c180d1b22337d"} Apr 16 18:52:28.991591 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:28.991558 2578 generic.go:358] "Generic (PLEG): container finished" podID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerID="a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a" exitCode=0 Apr 16 18:52:28.992019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:28.991634 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" 
event={"ID":"bc3e9fc6-5a61-4c6f-9088-875636ffed68","Type":"ContainerDied","Data":"a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a"} Apr 16 18:52:54.593098 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:52:54.593052 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-conmon-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:52:54.593694 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:52:54.593658 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab57e36_d3a8_44ba_bb7f_4847aa27fd42.slice/crio-5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-conmon-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:52:54.594347 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:52:54.593169 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab57e36_d3a8_44ba_bb7f_4847aa27fd42.slice/crio-5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-conmon-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:52:54.596969 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:52:54.593064 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3e9fc6_5a61_4c6f_9088_875636ffed68.slice/crio-conmon-a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a.scope\": RecentStats: unable to find data in memory cache]" Apr 16 18:52:55.103813 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:55.103778 2578 generic.go:358] "Generic (PLEG): container finished" podID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerID="73d1ee197cfc2ba2b6c9c356436ec65ea24f238fbf1f81a3120b861a5aab395b" exitCode=137 Apr 16 18:52:55.103962 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:55.103836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" event={"ID":"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42","Type":"ContainerDied","Data":"73d1ee197cfc2ba2b6c9c356436ec65ea24f238fbf1f81a3120b861a5aab395b"} Apr 16 18:52:55.272336 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:55.272309 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:52:55.375976 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:55.375896 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42-kserve-provision-location\") pod \"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42\" (UID: \"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42\") " Apr 16 18:52:55.379956 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:55.379910 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" (UID: "2ab57e36-d3a8-44ba-bb7f-4847aa27fd42"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:55.477053 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:55.477010 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:52:56.109427 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:56.109385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" event={"ID":"2ab57e36-d3a8-44ba-bb7f-4847aa27fd42","Type":"ContainerDied","Data":"5d96af8ff5b7bb2b1ce9c2457e24eeb5850734c55b9c432b8268e0d22ed49662"} Apr 16 18:52:56.109907 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:56.109442 2578 scope.go:117] "RemoveContainer" containerID="73d1ee197cfc2ba2b6c9c356436ec65ea24f238fbf1f81a3120b861a5aab395b" Apr 16 18:52:56.109907 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:56.109464 2578 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj" Apr 16 18:52:56.120528 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:56.120486 2578 scope.go:117] "RemoveContainer" containerID="f1356d74747ee4c7ac5d3252ed07efda0c6933eb001982252e472e704535fed5" Apr 16 18:52:56.125470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:56.125444 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj"] Apr 16 18:52:56.128951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:56.128919 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-5fjpj"] Apr 16 18:52:57.650322 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:52:57.649904 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" path="/var/lib/kubelet/pods/2ab57e36-d3a8-44ba-bb7f-4847aa27fd42/volumes" Apr 16 18:54:23.405535 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:23.405502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" event={"ID":"bc3e9fc6-5a61-4c6f-9088-875636ffed68","Type":"ContainerStarted","Data":"98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996"} Apr 16 18:54:23.405977 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:23.405736 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:54:23.406862 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:23.406839 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 18:54:23.425461 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:54:23.425415 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" podStartSLOduration=5.172330664 podStartE2EDuration="1m59.425403383s" podCreationTimestamp="2026-04-16 18:52:24 +0000 UTC" firstStartedPulling="2026-04-16 18:52:28.992634168 +0000 UTC m=+2943.936247205" lastFinishedPulling="2026-04-16 18:54:23.245706889 +0000 UTC m=+3058.189319924" observedRunningTime="2026-04-16 18:54:23.423745831 +0000 UTC m=+3058.367358888" watchObservedRunningTime="2026-04-16 18:54:23.425403383 +0000 UTC m=+3058.369016442" Apr 16 18:54:24.408318 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:24.408277 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused" Apr 16 18:54:34.409891 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:34.409857 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:54:36.095748 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.095716 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc"] Apr 16 18:54:36.096113 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.096005 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="kserve-container" containerID="cri-o://98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996" gracePeriod=30 Apr 16 18:54:36.210637 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.210600 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6"] Apr 16 18:54:36.210962 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.210949 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="kserve-container" Apr 16 18:54:36.211007 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.210964 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="kserve-container" Apr 16 18:54:36.211007 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.210982 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="storage-initializer" Apr 16 18:54:36.211007 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.210987 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="storage-initializer" Apr 16 18:54:36.211111 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.211042 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ab57e36-d3a8-44ba-bb7f-4847aa27fd42" containerName="kserve-container" Apr 16 18:54:36.239506 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.239478 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6"] Apr 16 18:54:36.239655 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.239575 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:54:36.284590 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.284552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/447029d4-b1a3-4f68-9c17-376eec45be4c-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-66sb6\" (UID: \"447029d4-b1a3-4f68-9c17-376eec45be4c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:54:36.385790 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.385696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/447029d4-b1a3-4f68-9c17-376eec45be4c-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-66sb6\" (UID: \"447029d4-b1a3-4f68-9c17-376eec45be4c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:54:36.386071 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.386053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/447029d4-b1a3-4f68-9c17-376eec45be4c-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-66sb6\" (UID: \"447029d4-b1a3-4f68-9c17-376eec45be4c\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:54:36.549551 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.549513 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:54:36.707619 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:36.707590 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6"] Apr 16 18:54:36.712170 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:54:36.712141 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod447029d4_b1a3_4f68_9c17_376eec45be4c.slice/crio-38cda02dd6d3d227262047693b167f44f682a0aa44c28927921d32eb13a86227 WatchSource:0}: Error finding container 38cda02dd6d3d227262047693b167f44f682a0aa44c28927921d32eb13a86227: Status 404 returned error can't find the container with id 38cda02dd6d3d227262047693b167f44f682a0aa44c28927921d32eb13a86227 Apr 16 18:54:37.446601 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:37.446566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" event={"ID":"447029d4-b1a3-4f68-9c17-376eec45be4c","Type":"ContainerStarted","Data":"70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783"} Apr 16 18:54:37.446601 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:37.446609 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" event={"ID":"447029d4-b1a3-4f68-9c17-376eec45be4c","Type":"ContainerStarted","Data":"38cda02dd6d3d227262047693b167f44f682a0aa44c28927921d32eb13a86227"} Apr 16 18:54:39.045759 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.045737 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:54:39.112259 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.112153 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc3e9fc6-5a61-4c6f-9088-875636ffed68-kserve-provision-location\") pod \"bc3e9fc6-5a61-4c6f-9088-875636ffed68\" (UID: \"bc3e9fc6-5a61-4c6f-9088-875636ffed68\") " Apr 16 18:54:39.112555 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.112533 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3e9fc6-5a61-4c6f-9088-875636ffed68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc3e9fc6-5a61-4c6f-9088-875636ffed68" (UID: "bc3e9fc6-5a61-4c6f-9088-875636ffed68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:54:39.213460 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.213426 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc3e9fc6-5a61-4c6f-9088-875636ffed68-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:54:39.452896 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.452858 2578 generic.go:358] "Generic (PLEG): container finished" podID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerID="98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996" exitCode=0 Apr 16 18:54:39.453082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.452914 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" Apr 16 18:54:39.453082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.452927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" event={"ID":"bc3e9fc6-5a61-4c6f-9088-875636ffed68","Type":"ContainerDied","Data":"98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996"} Apr 16 18:54:39.453082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.452969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc" event={"ID":"bc3e9fc6-5a61-4c6f-9088-875636ffed68","Type":"ContainerDied","Data":"050eab03ce4ada781ab9edec5c76e9e26aabb04497616055a86c180d1b22337d"} Apr 16 18:54:39.453082 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.452986 2578 scope.go:117] "RemoveContainer" containerID="98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996" Apr 16 18:54:39.461910 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.461883 2578 scope.go:117] "RemoveContainer" containerID="a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a" Apr 16 18:54:39.468897 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.468878 2578 scope.go:117] "RemoveContainer" containerID="98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996" Apr 16 18:54:39.469138 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:54:39.469114 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996\": container with ID starting with 98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996 not found: ID does not exist" containerID="98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996" Apr 16 18:54:39.469241 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.469148 2578 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996"} err="failed to get container status \"98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996\": rpc error: code = NotFound desc = could not find container \"98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996\": container with ID starting with 98634607e45847ec6200fb2c8e36d907c7c9c9a14d63b43f234412cdbee72996 not found: ID does not exist" Apr 16 18:54:39.469241 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.469165 2578 scope.go:117] "RemoveContainer" containerID="a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a" Apr 16 18:54:39.469431 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:54:39.469414 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a\": container with ID starting with a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a not found: ID does not exist" containerID="a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a" Apr 16 18:54:39.469473 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.469438 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a"} err="failed to get container status \"a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a\": rpc error: code = NotFound desc = could not find container \"a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a\": container with ID starting with a8e5ae908eb5efff20cbe402cddcbed22aa88d49e4ce3d8556d36236b6df035a not found: ID does not exist" Apr 16 18:54:39.474254 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.474233 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc"] Apr 16 18:54:39.479863 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.479844 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ztjlc"] Apr 16 18:54:39.648291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:39.648259 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" path="/var/lib/kubelet/pods/bc3e9fc6-5a61-4c6f-9088-875636ffed68/volumes" Apr 16 18:54:41.461690 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:41.461654 2578 generic.go:358] "Generic (PLEG): container finished" podID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerID="70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783" exitCode=0 Apr 16 18:54:41.462124 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:54:41.461727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" event={"ID":"447029d4-b1a3-4f68-9c17-376eec45be4c","Type":"ContainerDied","Data":"70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783"} Apr 16 18:55:05.547883 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:05.547846 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" event={"ID":"447029d4-b1a3-4f68-9c17-376eec45be4c","Type":"ContainerStarted","Data":"b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e"} Apr 16 18:55:05.548382 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:05.548216 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:55:05.549534 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:05.549507 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: 
connection refused" Apr 16 18:55:05.567204 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:05.567143 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podStartSLOduration=5.946818444 podStartE2EDuration="29.567131337s" podCreationTimestamp="2026-04-16 18:54:36 +0000 UTC" firstStartedPulling="2026-04-16 18:54:41.462933018 +0000 UTC m=+3076.406546053" lastFinishedPulling="2026-04-16 18:55:05.083245911 +0000 UTC m=+3100.026858946" observedRunningTime="2026-04-16 18:55:05.565834634 +0000 UTC m=+3100.509447689" watchObservedRunningTime="2026-04-16 18:55:05.567131337 +0000 UTC m=+3100.510744394" Apr 16 18:55:06.551834 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:06.551797 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:55:16.552726 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:16.552683 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:55:26.552794 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:26.552751 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:55:36.552527 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:36.552437 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" 
podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:55:46.552495 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:46.552448 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:55:56.552514 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:55:56.552473 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:56:06.553402 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:06.553375 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:56:16.333581 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.333545 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6"] Apr 16 18:56:16.333996 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.333806 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" containerID="cri-o://b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e" gracePeriod=30 Apr 16 18:56:16.430408 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.430371 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p"] Apr 16 18:56:16.430719 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 18:56:16.430707 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="storage-initializer" Apr 16 18:56:16.430765 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.430721 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="storage-initializer" Apr 16 18:56:16.430765 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.430734 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="kserve-container" Apr 16 18:56:16.430765 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.430740 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="kserve-container" Apr 16 18:56:16.430896 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.430785 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3e9fc6-5a61-4c6f-9088-875636ffed68" containerName="kserve-container" Apr 16 18:56:16.433844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.433828 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:56:16.442887 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.442867 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p"] Apr 16 18:56:16.459874 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.459848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fc6e24-e812-4ead-ace8-852367234e82-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p\" (UID: \"d8fc6e24-e812-4ead-ace8-852367234e82\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:56:16.552621 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.552581 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 16 18:56:16.560220 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.560179 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fc6e24-e812-4ead-ace8-852367234e82-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p\" (UID: \"d8fc6e24-e812-4ead-ace8-852367234e82\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:56:16.560542 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.560526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fc6e24-e812-4ead-ace8-852367234e82-kserve-provision-location\") pod 
\"isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p\" (UID: \"d8fc6e24-e812-4ead-ace8-852367234e82\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:56:16.743690 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.743666 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:56:16.872302 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.872271 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p"] Apr 16 18:56:16.875704 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:56:16.875676 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8fc6e24_e812_4ead_ace8_852367234e82.slice/crio-b621e95ac44757137dec06e6f004c9173127f5187592fe1802530f9795740146 WatchSource:0}: Error finding container b621e95ac44757137dec06e6f004c9173127f5187592fe1802530f9795740146: Status 404 returned error can't find the container with id b621e95ac44757137dec06e6f004c9173127f5187592fe1802530f9795740146 Apr 16 18:56:16.877548 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:16.877532 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:56:17.753758 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:17.753724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" event={"ID":"d8fc6e24-e812-4ead-ace8-852367234e82","Type":"ContainerStarted","Data":"03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7"} Apr 16 18:56:17.754164 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:17.753763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" 
event={"ID":"d8fc6e24-e812-4ead-ace8-852367234e82","Type":"ContainerStarted","Data":"b621e95ac44757137dec06e6f004c9173127f5187592fe1802530f9795740146"} Apr 16 18:56:19.967133 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:19.967111 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:56:20.090580 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.090495 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/447029d4-b1a3-4f68-9c17-376eec45be4c-kserve-provision-location\") pod \"447029d4-b1a3-4f68-9c17-376eec45be4c\" (UID: \"447029d4-b1a3-4f68-9c17-376eec45be4c\") " Apr 16 18:56:20.090735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.090708 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/447029d4-b1a3-4f68-9c17-376eec45be4c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "447029d4-b1a3-4f68-9c17-376eec45be4c" (UID: "447029d4-b1a3-4f68-9c17-376eec45be4c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:20.191725 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.191687 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/447029d4-b1a3-4f68-9c17-376eec45be4c-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:56:20.763900 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.763862 2578 generic.go:358] "Generic (PLEG): container finished" podID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerID="b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e" exitCode=0 Apr 16 18:56:20.764073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.763905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" event={"ID":"447029d4-b1a3-4f68-9c17-376eec45be4c","Type":"ContainerDied","Data":"b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e"} Apr 16 18:56:20.764073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.763925 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" Apr 16 18:56:20.764073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.763940 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6" event={"ID":"447029d4-b1a3-4f68-9c17-376eec45be4c","Type":"ContainerDied","Data":"38cda02dd6d3d227262047693b167f44f682a0aa44c28927921d32eb13a86227"} Apr 16 18:56:20.764073 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.763955 2578 scope.go:117] "RemoveContainer" containerID="b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e" Apr 16 18:56:20.772444 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.772427 2578 scope.go:117] "RemoveContainer" containerID="70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783" Apr 16 18:56:20.779338 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.779324 2578 scope.go:117] "RemoveContainer" containerID="b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e" Apr 16 18:56:20.779565 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:56:20.779547 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e\": container with ID starting with b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e not found: ID does not exist" containerID="b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e" Apr 16 18:56:20.779616 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.779571 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e"} err="failed to get container status \"b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e\": rpc error: code = NotFound desc = could not find container \"b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e\": 
container with ID starting with b69ac357f128622c37f6b22031d6a748f183bc1c5b248e3925ed36a53e4f556e not found: ID does not exist" Apr 16 18:56:20.779616 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.779587 2578 scope.go:117] "RemoveContainer" containerID="70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783" Apr 16 18:56:20.779782 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:56:20.779764 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783\": container with ID starting with 70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783 not found: ID does not exist" containerID="70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783" Apr 16 18:56:20.779818 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.779789 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783"} err="failed to get container status \"70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783\": rpc error: code = NotFound desc = could not find container \"70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783\": container with ID starting with 70825889c12c0919a8b270adf1d14001e7dbf134cd0f6658b255c593981a5783 not found: ID does not exist" Apr 16 18:56:20.787593 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.787569 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6"] Apr 16 18:56:20.791881 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:20.791853 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-66sb6"] Apr 16 18:56:21.647407 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:21.647369 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" path="/var/lib/kubelet/pods/447029d4-b1a3-4f68-9c17-376eec45be4c/volumes" Apr 16 18:56:21.768078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:21.768047 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8fc6e24-e812-4ead-ace8-852367234e82" containerID="03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7" exitCode=0 Apr 16 18:56:21.768280 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:21.768121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" event={"ID":"d8fc6e24-e812-4ead-ace8-852367234e82","Type":"ContainerDied","Data":"03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7"} Apr 16 18:56:22.773503 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:22.773470 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" event={"ID":"d8fc6e24-e812-4ead-ace8-852367234e82","Type":"ContainerStarted","Data":"0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921"} Apr 16 18:56:22.773918 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:22.773757 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:56:22.790936 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:22.790889 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" podStartSLOduration=6.790875999 podStartE2EDuration="6.790875999s" podCreationTimestamp="2026-04-16 18:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:22.789233223 +0000 UTC m=+3177.732846279" watchObservedRunningTime="2026-04-16 18:56:22.790875999 +0000 UTC m=+3177.734489055" Apr 16 18:56:53.777858 
ip-10-0-139-88 kubenswrapper[2578]: I0416 18:56:53.777815 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.59:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.59:8080: connect: connection refused" Apr 16 18:57:03.780326 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:03.780292 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:57:06.522490 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.522407 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p"] Apr 16 18:57:06.522901 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.522641 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" containerName="kserve-container" containerID="cri-o://0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921" gracePeriod=30 Apr 16 18:57:06.591984 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.591950 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"] Apr 16 18:57:06.592291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.592278 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" Apr 16 18:57:06.592291 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.592293 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" Apr 16 18:57:06.592404 ip-10-0-139-88 
kubenswrapper[2578]: I0416 18:57:06.592308 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="storage-initializer" Apr 16 18:57:06.592404 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.592314 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="storage-initializer" Apr 16 18:57:06.592404 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.592368 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="447029d4-b1a3-4f68-9c17-376eec45be4c" containerName="kserve-container" Apr 16 18:57:06.595029 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.595014 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:06.606571 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.606546 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"] Apr 16 18:57:06.674740 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.674704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27bec7ac-413a-4832-8c4b-65a7f784d04d-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-wm5dp\" (UID: \"27bec7ac-413a-4832-8c4b-65a7f784d04d\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:06.776319 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.776232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27bec7ac-413a-4832-8c4b-65a7f784d04d-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-wm5dp\" (UID: \"27bec7ac-413a-4832-8c4b-65a7f784d04d\") " 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:06.776600 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.776578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27bec7ac-413a-4832-8c4b-65a7f784d04d-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-wm5dp\" (UID: \"27bec7ac-413a-4832-8c4b-65a7f784d04d\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:06.904399 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:06.904369 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:07.022313 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:07.022277 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"] Apr 16 18:57:07.025269 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:57:07.025238 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27bec7ac_413a_4832_8c4b_65a7f784d04d.slice/crio-a35ac7dc12e5af8d935259778f03ff1d1cabd5182d46df67f71795f0b320830b WatchSource:0}: Error finding container a35ac7dc12e5af8d935259778f03ff1d1cabd5182d46df67f71795f0b320830b: Status 404 returned error can't find the container with id a35ac7dc12e5af8d935259778f03ff1d1cabd5182d46df67f71795f0b320830b Apr 16 18:57:07.907823 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:07.907785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" event={"ID":"27bec7ac-413a-4832-8c4b-65a7f784d04d","Type":"ContainerStarted","Data":"d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae"} Apr 16 18:57:07.907823 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:07.907826 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" event={"ID":"27bec7ac-413a-4832-8c4b-65a7f784d04d","Type":"ContainerStarted","Data":"a35ac7dc12e5af8d935259778f03ff1d1cabd5182d46df67f71795f0b320830b"} Apr 16 18:57:10.918515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:10.918431 2578 generic.go:358] "Generic (PLEG): container finished" podID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerID="d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae" exitCode=0 Apr 16 18:57:10.918515 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:10.918501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" event={"ID":"27bec7ac-413a-4832-8c4b-65a7f784d04d","Type":"ContainerDied","Data":"d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae"} Apr 16 18:57:11.923526 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:11.923494 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" event={"ID":"27bec7ac-413a-4832-8c4b-65a7f784d04d","Type":"ContainerStarted","Data":"06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d"} Apr 16 18:57:11.923944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:11.923700 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:11.943021 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:11.942976 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" podStartSLOduration=5.942962095 podStartE2EDuration="5.942962095s" podCreationTimestamp="2026-04-16 18:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:57:11.940333967 +0000 UTC m=+3226.883947024" 
watchObservedRunningTime="2026-04-16 18:57:11.942962095 +0000 UTC m=+3226.886575151" Apr 16 18:57:13.062357 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.062336 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:57:13.132243 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.132142 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fc6e24-e812-4ead-ace8-852367234e82-kserve-provision-location\") pod \"d8fc6e24-e812-4ead-ace8-852367234e82\" (UID: \"d8fc6e24-e812-4ead-ace8-852367234e82\") " Apr 16 18:57:13.132496 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.132477 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8fc6e24-e812-4ead-ace8-852367234e82-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d8fc6e24-e812-4ead-ace8-852367234e82" (UID: "d8fc6e24-e812-4ead-ace8-852367234e82"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:57:13.233705 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.233663 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fc6e24-e812-4ead-ace8-852367234e82-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 18:57:13.931876 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.931780 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8fc6e24-e812-4ead-ace8-852367234e82" containerID="0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921" exitCode=0 Apr 16 18:57:13.931876 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.931842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" event={"ID":"d8fc6e24-e812-4ead-ace8-852367234e82","Type":"ContainerDied","Data":"0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921"} Apr 16 18:57:13.931876 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.931867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" event={"ID":"d8fc6e24-e812-4ead-ace8-852367234e82","Type":"ContainerDied","Data":"b621e95ac44757137dec06e6f004c9173127f5187592fe1802530f9795740146"} Apr 16 18:57:13.931876 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.931871 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p" Apr 16 18:57:13.931876 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.931881 2578 scope.go:117] "RemoveContainer" containerID="0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921" Apr 16 18:57:13.939897 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.939651 2578 scope.go:117] "RemoveContainer" containerID="03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7" Apr 16 18:57:13.946824 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.946807 2578 scope.go:117] "RemoveContainer" containerID="0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921" Apr 16 18:57:13.946994 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.946976 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p"] Apr 16 18:57:13.947071 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:57:13.947038 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921\": container with ID starting with 0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921 not found: ID does not exist" containerID="0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921" Apr 16 18:57:13.947139 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.947063 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921"} err="failed to get container status \"0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921\": rpc error: code = NotFound desc = could not find container \"0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921\": container with ID starting with 0f036ef57cecc3cfd30099a233203cf79c0e575680979daf1e71315c18a15921 not found: ID does not 
exist" Apr 16 18:57:13.947139 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.947082 2578 scope.go:117] "RemoveContainer" containerID="03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7" Apr 16 18:57:13.947324 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:57:13.947306 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7\": container with ID starting with 03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7 not found: ID does not exist" containerID="03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7" Apr 16 18:57:13.947393 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.947327 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7"} err="failed to get container status \"03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7\": rpc error: code = NotFound desc = could not find container \"03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7\": container with ID starting with 03d68401aa46bd5da3e4a7e03c3337119bb5243bbef6b7857bd31fdfe5574fa7 not found: ID does not exist" Apr 16 18:57:13.950447 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:13.950428 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-j7t5p"] Apr 16 18:57:15.648078 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:15.648039 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" path="/var/lib/kubelet/pods/d8fc6e24-e812-4ead-ace8-852367234e82/volumes" Apr 16 18:57:42.939777 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:42.939750 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" Apr 16 18:57:46.869460 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.869427 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"] Apr 16 18:57:46.870021 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.869780 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="kserve-container" containerID="cri-o://06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d" gracePeriod=30 Apr 16 18:57:46.943470 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.943436 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"] Apr 16 18:57:46.943919 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.943900 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" containerName="storage-initializer" Apr 16 18:57:46.943919 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.943921 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" containerName="storage-initializer" Apr 16 18:57:46.944079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.943943 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" containerName="kserve-container" Apr 16 18:57:46.944079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.943952 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" containerName="kserve-container" Apr 16 18:57:46.944079 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.944042 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d8fc6e24-e812-4ead-ace8-852367234e82" 
containerName="kserve-container"
Apr 16 18:57:46.948497 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.948476 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:57:46.954906 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:46.954880 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"]
Apr 16 18:57:47.023929 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:47.023898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21812bb9-8c58-430b-92da-af2694fbe920-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-g5c7c\" (UID: \"21812bb9-8c58-430b-92da-af2694fbe920\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:57:47.124548 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:47.124462 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21812bb9-8c58-430b-92da-af2694fbe920-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-g5c7c\" (UID: \"21812bb9-8c58-430b-92da-af2694fbe920\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:57:47.124829 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:47.124812 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21812bb9-8c58-430b-92da-af2694fbe920-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-g5c7c\" (UID: \"21812bb9-8c58-430b-92da-af2694fbe920\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:57:47.258876 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:47.258847 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:57:47.381944 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:47.381921 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"]
Apr 16 18:57:47.384905 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:57:47.384864 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21812bb9_8c58_430b_92da_af2694fbe920.slice/crio-a4f6d9eb93ef4f9ebb0f616d6278997bad926cfcb7d4622947e1e7543decb0cb WatchSource:0}: Error finding container a4f6d9eb93ef4f9ebb0f616d6278997bad926cfcb7d4622947e1e7543decb0cb: Status 404 returned error can't find the container with id a4f6d9eb93ef4f9ebb0f616d6278997bad926cfcb7d4622947e1e7543decb0cb
Apr 16 18:57:48.035796 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:48.035763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" event={"ID":"21812bb9-8c58-430b-92da-af2694fbe920","Type":"ContainerStarted","Data":"51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10"}
Apr 16 18:57:48.035796 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:48.035798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" event={"ID":"21812bb9-8c58-430b-92da-af2694fbe920","Type":"ContainerStarted","Data":"a4f6d9eb93ef4f9ebb0f616d6278997bad926cfcb7d4622947e1e7543decb0cb"}
Apr 16 18:57:52.049498 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:52.049466 2578 generic.go:358] "Generic (PLEG): container finished" podID="21812bb9-8c58-430b-92da-af2694fbe920" containerID="51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10" exitCode=0
Apr 16 18:57:52.049883 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:52.049540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" event={"ID":"21812bb9-8c58-430b-92da-af2694fbe920","Type":"ContainerDied","Data":"51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10"}
Apr 16 18:57:52.928356 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:52.928314 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.60:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.60:8080: connect: connection refused"
Apr 16 18:57:53.054465 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.054434 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" event={"ID":"21812bb9-8c58-430b-92da-af2694fbe920","Type":"ContainerStarted","Data":"a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d"}
Apr 16 18:57:53.054858 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.054723 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:57:53.056166 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.056140 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:57:53.071183 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.071141 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podStartSLOduration=7.07112826 podStartE2EDuration="7.07112826s" podCreationTimestamp="2026-04-16 18:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:57:53.069519786 +0000 UTC m=+3268.013132842" watchObservedRunningTime="2026-04-16 18:57:53.07112826 +0000 UTC m=+3268.014741316"
Apr 16 18:57:53.905709 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.905684 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"
Apr 16 18:57:53.980550 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.980505 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27bec7ac-413a-4832-8c4b-65a7f784d04d-kserve-provision-location\") pod \"27bec7ac-413a-4832-8c4b-65a7f784d04d\" (UID: \"27bec7ac-413a-4832-8c4b-65a7f784d04d\") "
Apr 16 18:57:53.980843 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:53.980821 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27bec7ac-413a-4832-8c4b-65a7f784d04d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "27bec7ac-413a-4832-8c4b-65a7f784d04d" (UID: "27bec7ac-413a-4832-8c4b-65a7f784d04d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:57:54.058756 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.058717 2578 generic.go:358] "Generic (PLEG): container finished" podID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerID="06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d" exitCode=0
Apr 16 18:57:54.059158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.058787 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"
Apr 16 18:57:54.059158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.058794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" event={"ID":"27bec7ac-413a-4832-8c4b-65a7f784d04d","Type":"ContainerDied","Data":"06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d"}
Apr 16 18:57:54.059158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.058832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp" event={"ID":"27bec7ac-413a-4832-8c4b-65a7f784d04d","Type":"ContainerDied","Data":"a35ac7dc12e5af8d935259778f03ff1d1cabd5182d46df67f71795f0b320830b"}
Apr 16 18:57:54.059158 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.058849 2578 scope.go:117] "RemoveContainer" containerID="06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d"
Apr 16 18:57:54.059583 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.059545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:57:54.066755 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.066737 2578 scope.go:117] "RemoveContainer" containerID="d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae"
Apr 16 18:57:54.073861 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.073842 2578 scope.go:117] "RemoveContainer" containerID="06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d"
Apr 16 18:57:54.074135 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:57:54.074117 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d\": container with ID starting with 06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d not found: ID does not exist" containerID="06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d"
Apr 16 18:57:54.074207 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.074144 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d"} err="failed to get container status \"06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d\": rpc error: code = NotFound desc = could not find container \"06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d\": container with ID starting with 06315fe5a0b4e4bc0a45d3ada11dfba3679b5fea2f7f985bf2255511682f848d not found: ID does not exist"
Apr 16 18:57:54.074207 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.074161 2578 scope.go:117] "RemoveContainer" containerID="d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae"
Apr 16 18:57:54.074429 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:57:54.074411 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae\": container with ID starting with d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae not found: ID does not exist" containerID="d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae"
Apr 16 18:57:54.074478 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.074435 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae"} err="failed to get container status \"d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae\": rpc error: code = NotFound desc = could not find container \"d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae\": container with ID starting with d7d4742ef1849911e5e6845ab6e5fad05f85487416b617a1369e384e216005ae not found: ID does not exist"
Apr 16 18:57:54.078396 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.078377 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"]
Apr 16 18:57:54.081388 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.081337 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/27bec7ac-413a-4832-8c4b-65a7f784d04d-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:57:54.081712 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:54.081693 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-wm5dp"]
Apr 16 18:57:55.648314 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:57:55.648274 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" path="/var/lib/kubelet/pods/27bec7ac-413a-4832-8c4b-65a7f784d04d/volumes"
Apr 16 18:58:04.060259 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:04.060217 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:58:14.060322 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:14.060275 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:58:24.059735 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:24.059697 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:58:34.060358 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:34.060319 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:58:44.059955 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:44.059907 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.61:8080: connect: connection refused"
Apr 16 18:58:54.060805 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:54.060769 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:58:57.063699 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.063668 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"]
Apr 16 18:58:57.064159 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.063920 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container" containerID="cri-o://a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d" gracePeriod=30
Apr 16 18:58:57.374844 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.374761 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"]
Apr 16 18:58:57.375264 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.375247 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="storage-initializer"
Apr 16 18:58:57.375367 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.375266 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="storage-initializer"
Apr 16 18:58:57.375367 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.375312 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="kserve-container"
Apr 16 18:58:57.375367 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.375322 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="kserve-container"
Apr 16 18:58:57.375530 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.375398 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="27bec7ac-413a-4832-8c4b-65a7f784d04d" containerName="kserve-container"
Apr 16 18:58:57.379588 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.379569 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:58:57.391023 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.390998 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"]
Apr 16 18:58:57.424129 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.424098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d7c4e1-26b9-4569-962b-d75c5c565026-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2\" (UID: \"54d7c4e1-26b9-4569-962b-d75c5c565026\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:58:57.525366 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.525333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d7c4e1-26b9-4569-962b-d75c5c565026-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2\" (UID: \"54d7c4e1-26b9-4569-962b-d75c5c565026\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:58:57.525664 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.525648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d7c4e1-26b9-4569-962b-d75c5c565026-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2\" (UID: \"54d7c4e1-26b9-4569-962b-d75c5c565026\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:58:57.689873 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.689772 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:58:57.806624 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:57.806530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"]
Apr 16 18:58:57.809500 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:58:57.809476 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d7c4e1_26b9_4569_962b_d75c5c565026.slice/crio-8cf2f7c1f62bca1f9bb17dccf7b52e4f078ab611595eaa988fcda684eab67449 WatchSource:0}: Error finding container 8cf2f7c1f62bca1f9bb17dccf7b52e4f078ab611595eaa988fcda684eab67449: Status 404 returned error can't find the container with id 8cf2f7c1f62bca1f9bb17dccf7b52e4f078ab611595eaa988fcda684eab67449
Apr 16 18:58:58.251104 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:58.251064 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" event={"ID":"54d7c4e1-26b9-4569-962b-d75c5c565026","Type":"ContainerStarted","Data":"b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f"}
Apr 16 18:58:58.251104 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:58:58.251103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" event={"ID":"54d7c4e1-26b9-4569-962b-d75c5c565026","Type":"ContainerStarted","Data":"8cf2f7c1f62bca1f9bb17dccf7b52e4f078ab611595eaa988fcda684eab67449"}
Apr 16 18:59:00.697217 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:00.697182 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:59:00.752426 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:00.752395 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21812bb9-8c58-430b-92da-af2694fbe920-kserve-provision-location\") pod \"21812bb9-8c58-430b-92da-af2694fbe920\" (UID: \"21812bb9-8c58-430b-92da-af2694fbe920\") "
Apr 16 18:59:00.752753 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:00.752728 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21812bb9-8c58-430b-92da-af2694fbe920-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21812bb9-8c58-430b-92da-af2694fbe920" (UID: "21812bb9-8c58-430b-92da-af2694fbe920"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:59:00.853378 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:00.853295 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21812bb9-8c58-430b-92da-af2694fbe920-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:59:01.259685 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.259652 2578 generic.go:358] "Generic (PLEG): container finished" podID="21812bb9-8c58-430b-92da-af2694fbe920" containerID="a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d" exitCode=0
Apr 16 18:59:01.259888 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.259726 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"
Apr 16 18:59:01.259888 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.259743 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" event={"ID":"21812bb9-8c58-430b-92da-af2694fbe920","Type":"ContainerDied","Data":"a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d"}
Apr 16 18:59:01.259888 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.259789 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c" event={"ID":"21812bb9-8c58-430b-92da-af2694fbe920","Type":"ContainerDied","Data":"a4f6d9eb93ef4f9ebb0f616d6278997bad926cfcb7d4622947e1e7543decb0cb"}
Apr 16 18:59:01.259888 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.259811 2578 scope.go:117] "RemoveContainer" containerID="a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d"
Apr 16 18:59:01.267660 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.267646 2578 scope.go:117] "RemoveContainer" containerID="51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10"
Apr 16 18:59:01.274661 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.274645 2578 scope.go:117] "RemoveContainer" containerID="a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d"
Apr 16 18:59:01.274901 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:59:01.274881 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d\": container with ID starting with a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d not found: ID does not exist" containerID="a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d"
Apr 16 18:59:01.274951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.274910 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d"} err="failed to get container status \"a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d\": rpc error: code = NotFound desc = could not find container \"a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d\": container with ID starting with a3cdad1b1b5a9d007f70b3516a4590d3161a38ad93e2d946954e6896b0ccfc7d not found: ID does not exist"
Apr 16 18:59:01.274951 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.274928 2578 scope.go:117] "RemoveContainer" containerID="51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10"
Apr 16 18:59:01.275134 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:59:01.275117 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10\": container with ID starting with 51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10 not found: ID does not exist" containerID="51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10"
Apr 16 18:59:01.275178 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.275139 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10"} err="failed to get container status \"51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10\": rpc error: code = NotFound desc = could not find container \"51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10\": container with ID starting with 51b59a5ff233b1792201d29d62e0b5fb405a739fc583f12b538fc3c3ebab3d10 not found: ID does not exist"
Apr 16 18:59:01.279139 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.279119 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"]
Apr 16 18:59:01.285416 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.285397 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-g5c7c"]
Apr 16 18:59:01.648246 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:01.648138 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21812bb9-8c58-430b-92da-af2694fbe920" path="/var/lib/kubelet/pods/21812bb9-8c58-430b-92da-af2694fbe920/volumes"
Apr 16 18:59:02.269531 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:02.269497 2578 generic.go:358] "Generic (PLEG): container finished" podID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerID="b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f" exitCode=0
Apr 16 18:59:02.269970 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:02.269570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" event={"ID":"54d7c4e1-26b9-4569-962b-d75c5c565026","Type":"ContainerDied","Data":"b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f"}
Apr 16 18:59:03.274592 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:03.274556 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" event={"ID":"54d7c4e1-26b9-4569-962b-d75c5c565026","Type":"ContainerStarted","Data":"3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74"}
Apr 16 18:59:03.275030 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:03.274784 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:59:03.292053 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:03.291999 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" podStartSLOduration=6.291983018 podStartE2EDuration="6.291983018s" podCreationTimestamp="2026-04-16 18:58:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:59:03.289584339 +0000 UTC m=+3338.233197406" watchObservedRunningTime="2026-04-16 18:59:03.291983018 +0000 UTC m=+3338.235596073"
Apr 16 18:59:34.340934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:34.340890 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 18:59:44.280204 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:44.280172 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:59:47.485532 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.485494 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"]
Apr 16 18:59:47.486013 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.485765 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="kserve-container" containerID="cri-o://3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74" gracePeriod=30
Apr 16 18:59:47.559934 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.559898 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"]
Apr 16 18:59:47.560295 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.560279 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="storage-initializer"
Apr 16 18:59:47.560295 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.560296 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="storage-initializer"
Apr 16 18:59:47.560392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.560309 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container"
Apr 16 18:59:47.560392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.560315 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container"
Apr 16 18:59:47.560392 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.560368 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="21812bb9-8c58-430b-92da-af2694fbe920" containerName="kserve-container"
Apr 16 18:59:47.564212 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.564152 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"
Apr 16 18:59:47.573626 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.573597 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"]
Apr 16 18:59:47.644372 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.644333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d896610-810a-43bf-9311-9a1d69b388bf-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-8gzgj\" (UID: \"7d896610-810a-43bf-9311-9a1d69b388bf\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"
Apr 16 18:59:47.745542 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.745452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d896610-810a-43bf-9311-9a1d69b388bf-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-8gzgj\" (UID: \"7d896610-810a-43bf-9311-9a1d69b388bf\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"
Apr 16 18:59:47.745820 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.745802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d896610-810a-43bf-9311-9a1d69b388bf-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-8gzgj\" (UID: \"7d896610-810a-43bf-9311-9a1d69b388bf\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"
Apr 16 18:59:47.875667 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.875633 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"
Apr 16 18:59:47.994350 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:47.994318 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"]
Apr 16 18:59:47.999307 ip-10-0-139-88 kubenswrapper[2578]: W0416 18:59:47.999275 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d896610_810a_43bf_9311_9a1d69b388bf.slice/crio-917926b586762ab598c95abf2c5cb1074d1f317fbf8cb433961ced6306405b08 WatchSource:0}: Error finding container 917926b586762ab598c95abf2c5cb1074d1f317fbf8cb433961ced6306405b08: Status 404 returned error can't find the container with id 917926b586762ab598c95abf2c5cb1074d1f317fbf8cb433961ced6306405b08
Apr 16 18:59:48.409866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:48.409780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" event={"ID":"7d896610-810a-43bf-9311-9a1d69b388bf","Type":"ContainerStarted","Data":"9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c"}
Apr 16 18:59:48.409866 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:48.409814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" event={"ID":"7d896610-810a-43bf-9311-9a1d69b388bf","Type":"ContainerStarted","Data":"917926b586762ab598c95abf2c5cb1074d1f317fbf8cb433961ced6306405b08"}
Apr 16 18:59:52.430150 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:52.430117 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d896610-810a-43bf-9311-9a1d69b388bf" containerID="9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c" exitCode=0
Apr 16 18:59:52.430646 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:52.430172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" event={"ID":"7d896610-810a-43bf-9311-9a1d69b388bf","Type":"ContainerDied","Data":"9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c"}
Apr 16 18:59:53.435337 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:53.435301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" event={"ID":"7d896610-810a-43bf-9311-9a1d69b388bf","Type":"ContainerStarted","Data":"58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e"}
Apr 16 18:59:53.435711 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:53.435607 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"
Apr 16 18:59:53.437019 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:53.436994 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 18:59:53.451677 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:53.451628 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podStartSLOduration=6.451612077 podStartE2EDuration="6.451612077s" podCreationTimestamp="2026-04-16 18:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:59:53.449576634 +0000 UTC m=+3388.393189694" watchObservedRunningTime="2026-04-16 18:59:53.451612077 +0000 UTC m=+3388.395225110"
Apr 16 18:59:54.278431 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:54.278388 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.62:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.62:8080: connect: connection refused"
Apr 16 18:59:54.439136 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:54.439099 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused"
Apr 16 18:59:54.929501 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:54.929479 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:59:55.006808 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.006730 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d7c4e1-26b9-4569-962b-d75c5c565026-kserve-provision-location\") pod \"54d7c4e1-26b9-4569-962b-d75c5c565026\" (UID: \"54d7c4e1-26b9-4569-962b-d75c5c565026\") "
Apr 16 18:59:55.007057 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.007033 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d7c4e1-26b9-4569-962b-d75c5c565026-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54d7c4e1-26b9-4569-962b-d75c5c565026" (UID: "54d7c4e1-26b9-4569-962b-d75c5c565026"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:59:55.108066 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.108029 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54d7c4e1-26b9-4569-962b-d75c5c565026-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\""
Apr 16 18:59:55.443253 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.443164 2578 generic.go:358] "Generic (PLEG): container finished" podID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerID="3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74" exitCode=0
Apr 16 18:59:55.443596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.443256 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"
Apr 16 18:59:55.443596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.443251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" event={"ID":"54d7c4e1-26b9-4569-962b-d75c5c565026","Type":"ContainerDied","Data":"3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74"}
Apr 16 18:59:55.443596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.443373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2" event={"ID":"54d7c4e1-26b9-4569-962b-d75c5c565026","Type":"ContainerDied","Data":"8cf2f7c1f62bca1f9bb17dccf7b52e4f078ab611595eaa988fcda684eab67449"}
Apr 16 18:59:55.443596 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.443393 2578 scope.go:117] "RemoveContainer" containerID="3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74"
Apr 16 18:59:55.452286 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.452265 2578 scope.go:117] "RemoveContainer"
containerID="b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f" Apr 16 18:59:55.459814 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.459796 2578 scope.go:117] "RemoveContainer" containerID="3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74" Apr 16 18:59:55.460052 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:59:55.460032 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74\": container with ID starting with 3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74 not found: ID does not exist" containerID="3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74" Apr 16 18:59:55.460128 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.460061 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74"} err="failed to get container status \"3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74\": rpc error: code = NotFound desc = could not find container \"3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74\": container with ID starting with 3b8ab32173fe4d9ced17ba670ff0331af485a3792c5e9e2a6278f84691ccbd74 not found: ID does not exist" Apr 16 18:59:55.460128 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.460079 2578 scope.go:117] "RemoveContainer" containerID="b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f" Apr 16 18:59:55.460354 ip-10-0-139-88 kubenswrapper[2578]: E0416 18:59:55.460337 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f\": container with ID starting with b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f not found: ID does not exist" 
containerID="b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f" Apr 16 18:59:55.460414 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.460358 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f"} err="failed to get container status \"b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f\": rpc error: code = NotFound desc = could not find container \"b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f\": container with ID starting with b8218aa98fb85cee7b55a9cdb973630c45b9e558a4fb522621941f2c421d031f not found: ID does not exist" Apr 16 18:59:55.467378 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.467339 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"] Apr 16 18:59:55.468792 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.468768 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-v4qj2"] Apr 16 18:59:55.647589 ip-10-0-139-88 kubenswrapper[2578]: I0416 18:59:55.647549 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" path="/var/lib/kubelet/pods/54d7c4e1-26b9-4569-962b-d75c5c565026/volumes" Apr 16 19:00:04.439165 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:04.439114 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:00:14.439249 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:14.439151 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" 
podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:00:24.439469 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:24.439418 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:00:34.440072 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:34.440026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:00:44.439274 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:44.439227 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.63:8080: connect: connection refused" Apr 16 19:00:54.439931 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:54.439897 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" Apr 16 19:00:57.680008 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.679977 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"] Apr 16 19:00:57.680498 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.680242 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" 
containerName="kserve-container" containerID="cri-o://58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e" gracePeriod=30 Apr 16 19:00:57.756998 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.756963 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt"] Apr 16 19:00:57.757321 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.757309 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="kserve-container" Apr 16 19:00:57.757372 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.757322 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="kserve-container" Apr 16 19:00:57.757372 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.757333 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="storage-initializer" Apr 16 19:00:57.757372 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.757339 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="storage-initializer" Apr 16 19:00:57.757471 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.757390 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="54d7c4e1-26b9-4569-962b-d75c5c565026" containerName="kserve-container" Apr 16 19:00:57.761589 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.761572 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:00:57.763769 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.763747 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:00:57.773564 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.773544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt"] Apr 16 19:00:57.843247 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.843217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt\" (UID: \"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:00:57.944481 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.944385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt\" (UID: \"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:00:57.944770 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:57.944751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt\" (UID: \"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:00:58.071517 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:58.071488 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:00:58.194254 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:58.194224 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt"] Apr 16 19:00:58.198567 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:00:58.198538 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod992d8ad0_662a_47b2_9ba8_3b1ca031b3e4.slice/crio-423c1e7bd2867fd5501bb984a50538011ade00eeae88a7a69826731520dcad2e WatchSource:0}: Error finding container 423c1e7bd2867fd5501bb984a50538011ade00eeae88a7a69826731520dcad2e: Status 404 returned error can't find the container with id 423c1e7bd2867fd5501bb984a50538011ade00eeae88a7a69826731520dcad2e Apr 16 19:00:58.642857 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:58.642818 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" event={"ID":"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4","Type":"ContainerStarted","Data":"e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28"} Apr 16 19:00:58.642857 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:58.642858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" event={"ID":"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4","Type":"ContainerStarted","Data":"423c1e7bd2867fd5501bb984a50538011ade00eeae88a7a69826731520dcad2e"} Apr 16 19:00:59.647674 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:59.647641 2578 generic.go:358] "Generic (PLEG): container finished" podID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerID="e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28" exitCode=0 Apr 16 19:00:59.648139 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:00:59.647895 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" event={"ID":"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4","Type":"ContainerDied","Data":"e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28"} Apr 16 19:01:00.652322 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:00.652288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" event={"ID":"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4","Type":"ContainerStarted","Data":"4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0"} Apr 16 19:01:00.652706 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:00.652476 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:01:00.653691 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:00.653666 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:01:00.674851 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:00.674808 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podStartSLOduration=3.6747948470000003 podStartE2EDuration="3.674794847s" podCreationTimestamp="2026-04-16 19:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:01:00.673150973 +0000 UTC m=+3455.616764029" watchObservedRunningTime="2026-04-16 19:01:00.674794847 +0000 UTC m=+3455.618407902" Apr 16 19:01:01.421691 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.421671 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" Apr 16 19:01:01.476431 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.476398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d896610-810a-43bf-9311-9a1d69b388bf-kserve-provision-location\") pod \"7d896610-810a-43bf-9311-9a1d69b388bf\" (UID: \"7d896610-810a-43bf-9311-9a1d69b388bf\") " Apr 16 19:01:01.476716 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.476695 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d896610-810a-43bf-9311-9a1d69b388bf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d896610-810a-43bf-9311-9a1d69b388bf" (UID: "7d896610-810a-43bf-9311-9a1d69b388bf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:01:01.577734 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.577654 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d896610-810a-43bf-9311-9a1d69b388bf-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:01:01.657118 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.657092 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d896610-810a-43bf-9311-9a1d69b388bf" containerID="58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e" exitCode=0 Apr 16 19:01:01.657512 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.657152 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" Apr 16 19:01:01.657512 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.657168 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" event={"ID":"7d896610-810a-43bf-9311-9a1d69b388bf","Type":"ContainerDied","Data":"58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e"} Apr 16 19:01:01.657512 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.657216 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj" event={"ID":"7d896610-810a-43bf-9311-9a1d69b388bf","Type":"ContainerDied","Data":"917926b586762ab598c95abf2c5cb1074d1f317fbf8cb433961ced6306405b08"} Apr 16 19:01:01.657512 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.657236 2578 scope.go:117] "RemoveContainer" containerID="58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e" Apr 16 19:01:01.657707 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.657624 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:01:01.664893 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.664881 2578 scope.go:117] "RemoveContainer" containerID="9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c" Apr 16 19:01:01.671817 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.671801 2578 scope.go:117] "RemoveContainer" containerID="58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e" Apr 16 19:01:01.672077 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:01:01.672058 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e\": container with ID starting with 58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e not found: ID does not exist" containerID="58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e" Apr 16 19:01:01.672149 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.672091 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e"} err="failed to get container status \"58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e\": rpc error: code = NotFound desc = could not find container \"58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e\": container with ID starting with 58340c7dcc02ff07efb12f3499f2f9bf6fd1e1dd99e0fe9023d3bec396211b9e not found: ID does not exist" Apr 16 19:01:01.672149 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.672116 2578 scope.go:117] "RemoveContainer" containerID="9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c" Apr 16 19:01:01.672392 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:01:01.672374 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c\": container with ID starting with 9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c not found: ID does not exist" containerID="9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c" Apr 16 19:01:01.672447 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.672397 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c"} err="failed to get container status \"9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c\": rpc error: code = NotFound desc = could not find container 
\"9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c\": container with ID starting with 9317778a9248595e15a619088e8f38f2a0318984689074d537a02d21fb93c92c not found: ID does not exist" Apr 16 19:01:01.675948 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.675929 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"] Apr 16 19:01:01.679878 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:01.679858 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-8gzgj"] Apr 16 19:01:03.647372 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:03.647340 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" path="/var/lib/kubelet/pods/7d896610-810a-43bf-9311-9a1d69b388bf/volumes" Apr 16 19:01:11.658222 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:11.658168 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:01:21.657702 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:21.657660 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:01:31.657624 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:31.657586 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 
19:01:41.657582 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:41.657502 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:01:51.658436 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:01:51.658398 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:02:01.657846 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:01.657806 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:02:11.648077 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:11.648050 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:02:17.918328 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:17.918298 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt"] Apr 16 19:02:17.918831 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:17.918546 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" containerID="cri-o://4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0" gracePeriod=30 Apr 16 19:02:18.095641 ip-10-0-139-88 kubenswrapper[2578]: 
I0416 19:02:18.095601 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd"] Apr 16 19:02:18.096040 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.096024 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" Apr 16 19:02:18.096129 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.096042 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" Apr 16 19:02:18.096129 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.096092 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="storage-initializer" Apr 16 19:02:18.096129 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.096101 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="storage-initializer" Apr 16 19:02:18.096325 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.096180 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d896610-810a-43bf-9311-9a1d69b388bf" containerName="kserve-container" Apr 16 19:02:18.099204 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.099171 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.101737 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.101718 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:02:18.113748 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.113726 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd"] Apr 16 19:02:18.239778 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.239751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/11e0614c-b460-45b7-aa54-bdf2486f27dc-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.239973 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.239790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e0614c-b460-45b7-aa54-bdf2486f27dc-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.341097 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.341066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/11e0614c-b460-45b7-aa54-bdf2486f27dc-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.341277 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.341112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e0614c-b460-45b7-aa54-bdf2486f27dc-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.341478 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.341463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e0614c-b460-45b7-aa54-bdf2486f27dc-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.341817 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.341794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/11e0614c-b460-45b7-aa54-bdf2486f27dc-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.409288 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.409246 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:18.534395 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.534371 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd"] Apr 16 19:02:18.537009 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:02:18.536982 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e0614c_b460_45b7_aa54_bdf2486f27dc.slice/crio-14ed05d7cf996503b83fc441955ad63d7923322ae2900ab82b9dbdbc143cb0e3 WatchSource:0}: Error finding container 14ed05d7cf996503b83fc441955ad63d7923322ae2900ab82b9dbdbc143cb0e3: Status 404 returned error can't find the container with id 14ed05d7cf996503b83fc441955ad63d7923322ae2900ab82b9dbdbc143cb0e3 Apr 16 19:02:18.539299 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.539279 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:02:18.887132 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.887042 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" event={"ID":"11e0614c-b460-45b7-aa54-bdf2486f27dc","Type":"ContainerStarted","Data":"c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f"} Apr 16 19:02:18.887132 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:18.887083 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" event={"ID":"11e0614c-b460-45b7-aa54-bdf2486f27dc","Type":"ContainerStarted","Data":"14ed05d7cf996503b83fc441955ad63d7923322ae2900ab82b9dbdbc143cb0e3"} Apr 16 19:02:19.891928 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:19.891839 2578 generic.go:358] "Generic (PLEG): container finished" podID="11e0614c-b460-45b7-aa54-bdf2486f27dc" 
containerID="c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f" exitCode=0 Apr 16 19:02:19.892319 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:19.891926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" event={"ID":"11e0614c-b460-45b7-aa54-bdf2486f27dc","Type":"ContainerDied","Data":"c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f"} Apr 16 19:02:20.897031 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:20.897000 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" event={"ID":"11e0614c-b460-45b7-aa54-bdf2486f27dc","Type":"ContainerStarted","Data":"e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9"} Apr 16 19:02:20.897438 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:20.897226 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:02:20.898484 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:20.898462 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:02:20.913733 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:20.913689 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podStartSLOduration=2.913675787 podStartE2EDuration="2.913675787s" podCreationTimestamp="2026-04-16 19:02:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:02:20.913388609 +0000 UTC 
m=+3535.857001664" watchObservedRunningTime="2026-04-16 19:02:20.913675787 +0000 UTC m=+3535.857288843" Apr 16 19:02:21.644984 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:21.644937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 16 19:02:21.900139 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:21.900051 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:02:22.360373 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.360346 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:02:22.376428 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.376397 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4-kserve-provision-location\") pod \"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4\" (UID: \"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4\") " Apr 16 19:02:22.376720 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.376692 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" (UID: "992d8ad0-662a-47b2-9ba8-3b1ca031b3e4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:02:22.477707 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.477678 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:02:22.904325 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.904243 2578 generic.go:358] "Generic (PLEG): container finished" podID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerID="4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0" exitCode=0 Apr 16 19:02:22.904325 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.904320 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" Apr 16 19:02:22.904757 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.904325 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" event={"ID":"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4","Type":"ContainerDied","Data":"4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0"} Apr 16 19:02:22.904757 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.904363 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt" event={"ID":"992d8ad0-662a-47b2-9ba8-3b1ca031b3e4","Type":"ContainerDied","Data":"423c1e7bd2867fd5501bb984a50538011ade00eeae88a7a69826731520dcad2e"} Apr 16 19:02:22.904757 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.904379 2578 scope.go:117] "RemoveContainer" containerID="4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0" Apr 16 19:02:22.912293 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.912277 2578 scope.go:117] "RemoveContainer" containerID="e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28" Apr 16 
19:02:22.919228 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.919212 2578 scope.go:117] "RemoveContainer" containerID="4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0" Apr 16 19:02:22.919469 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:02:22.919450 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0\": container with ID starting with 4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0 not found: ID does not exist" containerID="4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0" Apr 16 19:02:22.919560 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.919477 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0"} err="failed to get container status \"4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0\": rpc error: code = NotFound desc = could not find container \"4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0\": container with ID starting with 4153883fab108509f855150b58ae1bccac7a488d9fa8318a2fbda5372108cfa0 not found: ID does not exist" Apr 16 19:02:22.919560 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.919494 2578 scope.go:117] "RemoveContainer" containerID="e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28" Apr 16 19:02:22.919731 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:02:22.919715 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28\": container with ID starting with e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28 not found: ID does not exist" containerID="e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28" Apr 16 19:02:22.919778 
ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.919735 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28"} err="failed to get container status \"e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28\": rpc error: code = NotFound desc = could not find container \"e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28\": container with ID starting with e0195458c00d428d1c6676fbf84b8cf5e07e1d5aaa97a4023b1bab02cbb7bc28 not found: ID does not exist" Apr 16 19:02:22.926249 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.926215 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt"] Apr 16 19:02:22.930587 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:22.930569 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-6bcf4b75b5-kgkrt"] Apr 16 19:02:23.652979 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:23.652942 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" path="/var/lib/kubelet/pods/992d8ad0-662a-47b2-9ba8-3b1ca031b3e4/volumes" Apr 16 19:02:31.900437 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:31.900391 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:02:41.900439 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:41.900392 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.65:8080: connect: connection refused" Apr 16 19:02:51.900385 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:02:51.900343 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:03:01.900383 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:01.900338 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:03:11.900845 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:11.900749 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:03:21.900407 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:21.900362 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 16 19:03:31.900994 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:31.900954 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:03:38.068557 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:38.068508 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd"] Apr 16 19:03:38.068981 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:38.068783 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" containerID="cri-o://e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9" gracePeriod=30 Apr 16 19:03:39.117254 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.117221 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5"] Apr 16 19:03:39.117641 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.117547 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="storage-initializer" Apr 16 19:03:39.117641 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.117558 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="storage-initializer" Apr 16 19:03:39.117641 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.117569 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" Apr 16 19:03:39.117641 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.117575 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" Apr 16 19:03:39.117641 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.117638 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="992d8ad0-662a-47b2-9ba8-3b1ca031b3e4" containerName="kserve-container" Apr 16 19:03:39.120483 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.120465 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:39.141980 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.141951 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5"] Apr 16 19:03:39.272570 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.272530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22810edb-7a44-456a-bd85-e60e7d080b0f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5\" (UID: \"22810edb-7a44-456a-bd85-e60e7d080b0f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:39.374020 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.373929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22810edb-7a44-456a-bd85-e60e7d080b0f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5\" (UID: \"22810edb-7a44-456a-bd85-e60e7d080b0f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:39.374382 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.374357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22810edb-7a44-456a-bd85-e60e7d080b0f-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5\" (UID: \"22810edb-7a44-456a-bd85-e60e7d080b0f\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:39.430318 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.430286 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:39.551559 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:39.551532 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5"] Apr 16 19:03:39.554186 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:03:39.554155 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22810edb_7a44_456a_bd85_e60e7d080b0f.slice/crio-fc5086077fbd930d8de2d8aef3227272ac8ad1409a30c021090b29c1edf8ab40 WatchSource:0}: Error finding container fc5086077fbd930d8de2d8aef3227272ac8ad1409a30c021090b29c1edf8ab40: Status 404 returned error can't find the container with id fc5086077fbd930d8de2d8aef3227272ac8ad1409a30c021090b29c1edf8ab40 Apr 16 19:03:40.135581 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:40.135544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" event={"ID":"22810edb-7a44-456a-bd85-e60e7d080b0f","Type":"ContainerStarted","Data":"674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168"} Apr 16 19:03:40.135581 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:40.135583 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" event={"ID":"22810edb-7a44-456a-bd85-e60e7d080b0f","Type":"ContainerStarted","Data":"fc5086077fbd930d8de2d8aef3227272ac8ad1409a30c021090b29c1edf8ab40"} Apr 16 19:03:41.900272 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:41.900221 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: 
connection refused" Apr 16 19:03:42.309399 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.309377 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:03:42.400667 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.400633 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e0614c-b460-45b7-aa54-bdf2486f27dc-kserve-provision-location\") pod \"11e0614c-b460-45b7-aa54-bdf2486f27dc\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " Apr 16 19:03:42.400866 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.400705 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/11e0614c-b460-45b7-aa54-bdf2486f27dc-cabundle-cert\") pod \"11e0614c-b460-45b7-aa54-bdf2486f27dc\" (UID: \"11e0614c-b460-45b7-aa54-bdf2486f27dc\") " Apr 16 19:03:42.400989 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.400960 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11e0614c-b460-45b7-aa54-bdf2486f27dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "11e0614c-b460-45b7-aa54-bdf2486f27dc" (UID: "11e0614c-b460-45b7-aa54-bdf2486f27dc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:03:42.401097 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.401061 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e0614c-b460-45b7-aa54-bdf2486f27dc-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "11e0614c-b460-45b7-aa54-bdf2486f27dc" (UID: "11e0614c-b460-45b7-aa54-bdf2486f27dc"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:03:42.502156 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.502131 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11e0614c-b460-45b7-aa54-bdf2486f27dc-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:03:42.502156 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:42.502156 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/11e0614c-b460-45b7-aa54-bdf2486f27dc-cabundle-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:03:43.146296 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.146258 2578 generic.go:358] "Generic (PLEG): container finished" podID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerID="e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9" exitCode=0 Apr 16 19:03:43.146733 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.146323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" event={"ID":"11e0614c-b460-45b7-aa54-bdf2486f27dc","Type":"ContainerDied","Data":"e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9"} Apr 16 19:03:43.146733 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.146348 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" Apr 16 19:03:43.146733 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.146356 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd" event={"ID":"11e0614c-b460-45b7-aa54-bdf2486f27dc","Type":"ContainerDied","Data":"14ed05d7cf996503b83fc441955ad63d7923322ae2900ab82b9dbdbc143cb0e3"} Apr 16 19:03:43.146733 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.146375 2578 scope.go:117] "RemoveContainer" containerID="e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9" Apr 16 19:03:43.158773 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.158488 2578 scope.go:117] "RemoveContainer" containerID="c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f" Apr 16 19:03:43.166047 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.166022 2578 scope.go:117] "RemoveContainer" containerID="e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9" Apr 16 19:03:43.166429 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:03:43.166386 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9\": container with ID starting with e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9 not found: ID does not exist" containerID="e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9" Apr 16 19:03:43.166527 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.166430 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9"} err="failed to get container status \"e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9\": rpc error: code = NotFound desc = could not find container 
\"e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9\": container with ID starting with e0632a31c86fe9cf4a502b4459c86e3ac5d25afe3b00e5df38267319cb6fcdf9 not found: ID does not exist" Apr 16 19:03:43.166527 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.166462 2578 scope.go:117] "RemoveContainer" containerID="c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f" Apr 16 19:03:43.166794 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:03:43.166762 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f\": container with ID starting with c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f not found: ID does not exist" containerID="c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f" Apr 16 19:03:43.166857 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.166819 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f"} err="failed to get container status \"c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f\": rpc error: code = NotFound desc = could not find container \"c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f\": container with ID starting with c5ac5d488fdbcddfad19bb0e5a98f8bda33a6843071c520a5ce2282b8113299f not found: ID does not exist" Apr 16 19:03:43.187553 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.187526 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd"] Apr 16 19:03:43.192098 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:43.192077 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-648ccfcccc-4qljd"] Apr 16 19:03:43.650961 ip-10-0-139-88 kubenswrapper[2578]: I0416 
19:03:43.650931 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" path="/var/lib/kubelet/pods/11e0614c-b460-45b7-aa54-bdf2486f27dc/volumes" Apr 16 19:03:46.158585 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:46.158553 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5_22810edb-7a44-456a-bd85-e60e7d080b0f/storage-initializer/0.log" Apr 16 19:03:46.158585 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:46.158588 2578 generic.go:358] "Generic (PLEG): container finished" podID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerID="674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168" exitCode=1 Apr 16 19:03:46.159041 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:46.158635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" event={"ID":"22810edb-7a44-456a-bd85-e60e7d080b0f","Type":"ContainerDied","Data":"674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168"} Apr 16 19:03:47.163574 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:47.163544 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5_22810edb-7a44-456a-bd85-e60e7d080b0f/storage-initializer/0.log" Apr 16 19:03:47.163965 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:47.163600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" event={"ID":"22810edb-7a44-456a-bd85-e60e7d080b0f","Type":"ContainerStarted","Data":"f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83"} Apr 16 19:03:49.148166 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:49.148134 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5"] Apr 16 19:03:49.148597 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:49.148384 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" containerID="cri-o://f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83" gracePeriod=30 Apr 16 19:03:50.220212 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.220167 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh"] Apr 16 19:03:50.220586 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.220507 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="storage-initializer" Apr 16 19:03:50.220586 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.220519 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="storage-initializer" Apr 16 19:03:50.220586 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.220545 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" Apr 16 19:03:50.220586 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.220551 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" Apr 16 19:03:50.220719 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.220606 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="11e0614c-b460-45b7-aa54-bdf2486f27dc" containerName="kserve-container" Apr 16 19:03:50.223798 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.223783 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.225763 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.225740 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:03:50.239553 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.239524 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh"] Apr 16 19:03:50.264583 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.264541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0f24903c-e578-42be-b401-e71c6d39b6a7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.264793 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.264592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f24903c-e578-42be-b401-e71c6d39b6a7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.365854 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.365828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0f24903c-e578-42be-b401-e71c6d39b6a7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.365976 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.365883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f24903c-e578-42be-b401-e71c6d39b6a7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.366227 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.366211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f24903c-e578-42be-b401-e71c6d39b6a7-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.366613 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.366592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0f24903c-e578-42be-b401-e71c6d39b6a7-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.479391 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.479333 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5_22810edb-7a44-456a-bd85-e60e7d080b0f/storage-initializer/1.log" Apr 16 19:03:50.479676 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.479657 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5_22810edb-7a44-456a-bd85-e60e7d080b0f/storage-initializer/0.log" Apr 16 19:03:50.479771 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.479722 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:50.534041 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.534014 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:50.567795 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.567768 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22810edb-7a44-456a-bd85-e60e7d080b0f-kserve-provision-location\") pod \"22810edb-7a44-456a-bd85-e60e7d080b0f\" (UID: \"22810edb-7a44-456a-bd85-e60e7d080b0f\") " Apr 16 19:03:50.568120 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.568082 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22810edb-7a44-456a-bd85-e60e7d080b0f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "22810edb-7a44-456a-bd85-e60e7d080b0f" (UID: "22810edb-7a44-456a-bd85-e60e7d080b0f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:03:50.655018 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.654877 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh"] Apr 16 19:03:50.657546 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:03:50.657523 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f24903c_e578_42be_b401_e71c6d39b6a7.slice/crio-2e7cbdd9463ebdde3065ba5a8a96b7aebd9d62bf3d2aff4952d85f2b64f18d92 WatchSource:0}: Error finding container 2e7cbdd9463ebdde3065ba5a8a96b7aebd9d62bf3d2aff4952d85f2b64f18d92: Status 404 returned error can't find the container with id 2e7cbdd9463ebdde3065ba5a8a96b7aebd9d62bf3d2aff4952d85f2b64f18d92 Apr 16 19:03:50.668340 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:50.668323 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22810edb-7a44-456a-bd85-e60e7d080b0f-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:03:51.182489 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.182455 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" event={"ID":"0f24903c-e578-42be-b401-e71c6d39b6a7","Type":"ContainerStarted","Data":"c8edf121f7ad62375f5aa3c0186b8f49a311af90443ee6d9d92bbaba702a573c"} Apr 16 19:03:51.182489 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.182492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" event={"ID":"0f24903c-e578-42be-b401-e71c6d39b6a7","Type":"ContainerStarted","Data":"2e7cbdd9463ebdde3065ba5a8a96b7aebd9d62bf3d2aff4952d85f2b64f18d92"} Apr 16 19:03:51.183682 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.183662 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5_22810edb-7a44-456a-bd85-e60e7d080b0f/storage-initializer/1.log" Apr 16 19:03:51.183991 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.183975 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5_22810edb-7a44-456a-bd85-e60e7d080b0f/storage-initializer/0.log" Apr 16 19:03:51.184044 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.184008 2578 generic.go:358] "Generic (PLEG): container finished" podID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerID="f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83" exitCode=1 Apr 16 19:03:51.184044 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.184035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" event={"ID":"22810edb-7a44-456a-bd85-e60e7d080b0f","Type":"ContainerDied","Data":"f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83"} Apr 16 19:03:51.184121 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.184066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" event={"ID":"22810edb-7a44-456a-bd85-e60e7d080b0f","Type":"ContainerDied","Data":"fc5086077fbd930d8de2d8aef3227272ac8ad1409a30c021090b29c1edf8ab40"} Apr 16 19:03:51.184121 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.184082 2578 scope.go:117] "RemoveContainer" containerID="f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83" Apr 16 19:03:51.184121 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.184087 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5" Apr 16 19:03:51.192920 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.192906 2578 scope.go:117] "RemoveContainer" containerID="674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168" Apr 16 19:03:51.199705 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.199689 2578 scope.go:117] "RemoveContainer" containerID="f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83" Apr 16 19:03:51.199963 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:03:51.199946 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83\": container with ID starting with f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83 not found: ID does not exist" containerID="f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83" Apr 16 19:03:51.200030 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.199974 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83"} err="failed to get container status \"f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83\": rpc error: code = NotFound desc = could not find container \"f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83\": container with ID starting with f967d97815c6258bacdbf47261151f43d16db5cdf402fbde89faecc6970fda83 not found: ID does not exist" Apr 16 19:03:51.200030 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.199997 2578 scope.go:117] "RemoveContainer" containerID="674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168" Apr 16 19:03:51.200259 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:03:51.200242 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168\": container with ID starting with 674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168 not found: ID does not exist" containerID="674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168" Apr 16 19:03:51.200306 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.200265 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168"} err="failed to get container status \"674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168\": rpc error: code = NotFound desc = could not find container \"674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168\": container with ID starting with 674011765bee04229e10a03d740484523d181beb1628ec0132dd0d6e41cef168 not found: ID does not exist" Apr 16 19:03:51.225636 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.225611 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5"] Apr 16 19:03:51.230834 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.230810 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69ddbbfd75-cztx5"] Apr 16 19:03:51.648531 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:51.648505 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" path="/var/lib/kubelet/pods/22810edb-7a44-456a-bd85-e60e7d080b0f/volumes" Apr 16 19:03:52.188886 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:52.188857 2578 generic.go:358] "Generic (PLEG): container finished" podID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerID="c8edf121f7ad62375f5aa3c0186b8f49a311af90443ee6d9d92bbaba702a573c" exitCode=0 Apr 16 19:03:52.189117 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:52.188944 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" event={"ID":"0f24903c-e578-42be-b401-e71c6d39b6a7","Type":"ContainerDied","Data":"c8edf121f7ad62375f5aa3c0186b8f49a311af90443ee6d9d92bbaba702a573c"} Apr 16 19:03:53.193415 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:53.193385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" event={"ID":"0f24903c-e578-42be-b401-e71c6d39b6a7","Type":"ContainerStarted","Data":"b5c4d06a9a69d051da904be3b927eab90def4f31f5edb5d6d5c1c8539bcc617c"} Apr 16 19:03:53.193829 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:53.193545 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:03:53.194921 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:53.194895 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:03:53.211759 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:53.211713 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podStartSLOduration=3.211695544 podStartE2EDuration="3.211695544s" podCreationTimestamp="2026-04-16 19:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:03:53.209796178 +0000 UTC m=+3628.153409234" watchObservedRunningTime="2026-04-16 19:03:53.211695544 +0000 UTC m=+3628.155308601" Apr 16 19:03:54.196625 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:03:54.196587 2578 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:04:04.197359 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:04:04.197316 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:04:14.196769 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:04:14.196721 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:04:24.197046 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:04:24.196997 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:04:34.197040 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:04:34.196998 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:04:44.197163 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:04:44.197109 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:04:54.196620 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:04:54.196576 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: connection refused" Apr 16 19:05:04.198102 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:04.198069 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:05:10.230519 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:10.230476 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh"] Apr 16 19:05:10.231085 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:10.230834 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" containerID="cri-o://b5c4d06a9a69d051da904be3b927eab90def4f31f5edb5d6d5c1c8539bcc617c" gracePeriod=30 Apr 16 19:05:11.317596 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.317555 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc"] Apr 16 19:05:11.318060 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.318045 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" Apr 16 19:05:11.318131 
ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.318063 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" Apr 16 19:05:11.318131 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.318087 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" Apr 16 19:05:11.318131 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.318096 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" Apr 16 19:05:11.318333 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.318180 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" Apr 16 19:05:11.318390 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.318379 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="22810edb-7a44-456a-bd85-e60e7d080b0f" containerName="storage-initializer" Apr 16 19:05:11.321257 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.321236 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:11.332583 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.332559 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc"] Apr 16 19:05:11.451981 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.451947 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f571c100-a59e-4537-bcbe-eda344080c7b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc\" (UID: \"f571c100-a59e-4537-bcbe-eda344080c7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:11.553391 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.553350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f571c100-a59e-4537-bcbe-eda344080c7b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc\" (UID: \"f571c100-a59e-4537-bcbe-eda344080c7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:11.553763 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.553737 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f571c100-a59e-4537-bcbe-eda344080c7b-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc\" (UID: \"f571c100-a59e-4537-bcbe-eda344080c7b\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:11.631547 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.631458 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:11.753477 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:11.753416 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc"] Apr 16 19:05:11.757479 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:05:11.757453 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf571c100_a59e_4537_bcbe_eda344080c7b.slice/crio-c9a96b0222a4581ef6b36229cb6753fb5aa7f24befe1a3130d2035807dac98b0 WatchSource:0}: Error finding container c9a96b0222a4581ef6b36229cb6753fb5aa7f24befe1a3130d2035807dac98b0: Status 404 returned error can't find the container with id c9a96b0222a4581ef6b36229cb6753fb5aa7f24befe1a3130d2035807dac98b0 Apr 16 19:05:12.438353 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:12.438315 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" event={"ID":"f571c100-a59e-4537-bcbe-eda344080c7b","Type":"ContainerStarted","Data":"729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea"} Apr 16 19:05:12.438353 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:12.438358 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" event={"ID":"f571c100-a59e-4537-bcbe-eda344080c7b","Type":"ContainerStarted","Data":"c9a96b0222a4581ef6b36229cb6753fb5aa7f24befe1a3130d2035807dac98b0"} Apr 16 19:05:14.196661 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.196620 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.67:8080: connect: 
connection refused" Apr 16 19:05:14.446299 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.446270 2578 generic.go:358] "Generic (PLEG): container finished" podID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerID="b5c4d06a9a69d051da904be3b927eab90def4f31f5edb5d6d5c1c8539bcc617c" exitCode=0 Apr 16 19:05:14.446448 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.446305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" event={"ID":"0f24903c-e578-42be-b401-e71c6d39b6a7","Type":"ContainerDied","Data":"b5c4d06a9a69d051da904be3b927eab90def4f31f5edb5d6d5c1c8539bcc617c"} Apr 16 19:05:14.570357 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.570335 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:05:14.679707 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.679671 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f24903c-e578-42be-b401-e71c6d39b6a7-kserve-provision-location\") pod \"0f24903c-e578-42be-b401-e71c6d39b6a7\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " Apr 16 19:05:14.679915 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.679740 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0f24903c-e578-42be-b401-e71c6d39b6a7-cabundle-cert\") pod \"0f24903c-e578-42be-b401-e71c6d39b6a7\" (UID: \"0f24903c-e578-42be-b401-e71c6d39b6a7\") " Apr 16 19:05:14.680006 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.679981 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f24903c-e578-42be-b401-e71c6d39b6a7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"0f24903c-e578-42be-b401-e71c6d39b6a7" (UID: "0f24903c-e578-42be-b401-e71c6d39b6a7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:05:14.680083 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.680061 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f24903c-e578-42be-b401-e71c6d39b6a7-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "0f24903c-e578-42be-b401-e71c6d39b6a7" (UID: "0f24903c-e578-42be-b401-e71c6d39b6a7"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:05:14.780711 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.780672 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/0f24903c-e578-42be-b401-e71c6d39b6a7-cabundle-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:05:14.780711 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:14.780702 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f24903c-e578-42be-b401-e71c6d39b6a7-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:05:15.450756 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.450723 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" event={"ID":"0f24903c-e578-42be-b401-e71c6d39b6a7","Type":"ContainerDied","Data":"2e7cbdd9463ebdde3065ba5a8a96b7aebd9d62bf3d2aff4952d85f2b64f18d92"} Apr 16 19:05:15.451180 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.450764 2578 scope.go:117] "RemoveContainer" containerID="b5c4d06a9a69d051da904be3b927eab90def4f31f5edb5d6d5c1c8539bcc617c" Apr 16 19:05:15.451180 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.450774 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh" Apr 16 19:05:15.458765 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.458730 2578 scope.go:117] "RemoveContainer" containerID="c8edf121f7ad62375f5aa3c0186b8f49a311af90443ee6d9d92bbaba702a573c" Apr 16 19:05:15.471952 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.471931 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh"] Apr 16 19:05:15.479892 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.479864 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7d7dbb7bfc-c4xmh"] Apr 16 19:05:15.647527 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:15.647486 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" path="/var/lib/kubelet/pods/0f24903c-e578-42be-b401-e71c6d39b6a7/volumes" Apr 16 19:05:17.460117 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:17.460094 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc_f571c100-a59e-4537-bcbe-eda344080c7b/storage-initializer/0.log" Apr 16 19:05:17.460497 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:17.460128 2578 generic.go:358] "Generic (PLEG): container finished" podID="f571c100-a59e-4537-bcbe-eda344080c7b" containerID="729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea" exitCode=1 Apr 16 19:05:17.460497 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:17.460224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" event={"ID":"f571c100-a59e-4537-bcbe-eda344080c7b","Type":"ContainerDied","Data":"729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea"} Apr 16 19:05:18.464887 ip-10-0-139-88 
kubenswrapper[2578]: I0416 19:05:18.464861 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc_f571c100-a59e-4537-bcbe-eda344080c7b/storage-initializer/0.log" Apr 16 19:05:18.465279 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:18.464946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" event={"ID":"f571c100-a59e-4537-bcbe-eda344080c7b","Type":"ContainerStarted","Data":"6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86"} Apr 16 19:05:21.341435 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.341404 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc"] Apr 16 19:05:21.341783 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.341640 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" containerID="cri-o://6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86" gracePeriod=30 Apr 16 19:05:21.465184 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.465161 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc_f571c100-a59e-4537-bcbe-eda344080c7b/storage-initializer/1.log" Apr 16 19:05:21.465554 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.465539 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc_f571c100-a59e-4537-bcbe-eda344080c7b/storage-initializer/0.log" Apr 16 19:05:21.465642 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.465614 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:21.474468 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474449 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc_f571c100-a59e-4537-bcbe-eda344080c7b/storage-initializer/1.log" Apr 16 19:05:21.474772 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474759 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc_f571c100-a59e-4537-bcbe-eda344080c7b/storage-initializer/0.log" Apr 16 19:05:21.474843 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474790 2578 generic.go:358] "Generic (PLEG): container finished" podID="f571c100-a59e-4537-bcbe-eda344080c7b" containerID="6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86" exitCode=1 Apr 16 19:05:21.474898 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474866 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" Apr 16 19:05:21.474898 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" event={"ID":"f571c100-a59e-4537-bcbe-eda344080c7b","Type":"ContainerDied","Data":"6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86"} Apr 16 19:05:21.475008 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474903 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc" event={"ID":"f571c100-a59e-4537-bcbe-eda344080c7b","Type":"ContainerDied","Data":"c9a96b0222a4581ef6b36229cb6753fb5aa7f24befe1a3130d2035807dac98b0"} Apr 16 19:05:21.475008 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.474918 2578 scope.go:117] "RemoveContainer" containerID="6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86" Apr 16 19:05:21.482253 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.482232 2578 scope.go:117] "RemoveContainer" containerID="729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea" Apr 16 19:05:21.489444 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.489429 2578 scope.go:117] "RemoveContainer" containerID="6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86" Apr 16 19:05:21.489687 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:05:21.489668 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86\": container with ID starting with 6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86 not found: ID does not exist" containerID="6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86" Apr 16 19:05:21.489758 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.489700 
2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86"} err="failed to get container status \"6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86\": rpc error: code = NotFound desc = could not find container \"6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86\": container with ID starting with 6c071e7720984e97e7e6b687f8f944147aec9499b40a0a4ab549daf49a56fe86 not found: ID does not exist" Apr 16 19:05:21.489758 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.489725 2578 scope.go:117] "RemoveContainer" containerID="729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea" Apr 16 19:05:21.489971 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:05:21.489952 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea\": container with ID starting with 729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea not found: ID does not exist" containerID="729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea" Apr 16 19:05:21.490061 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.489978 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea"} err="failed to get container status \"729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea\": rpc error: code = NotFound desc = could not find container \"729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea\": container with ID starting with 729f349edbda3bde483fe6dd11bbbc2cf32872478e6357553a48a915848553ea not found: ID does not exist" Apr 16 19:05:21.537606 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.537522 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f571c100-a59e-4537-bcbe-eda344080c7b-kserve-provision-location\") pod \"f571c100-a59e-4537-bcbe-eda344080c7b\" (UID: \"f571c100-a59e-4537-bcbe-eda344080c7b\") " Apr 16 19:05:21.537773 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.537751 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f571c100-a59e-4537-bcbe-eda344080c7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f571c100-a59e-4537-bcbe-eda344080c7b" (UID: "f571c100-a59e-4537-bcbe-eda344080c7b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:05:21.639003 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.638971 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f571c100-a59e-4537-bcbe-eda344080c7b-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:05:21.804380 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.804301 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc"] Apr 16 19:05:21.808710 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:21.808687 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-b8cd84c6c-6cpsc"] Apr 16 19:05:22.403497 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403465 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86"] Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403779 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" Apr 16 19:05:22.403888 ip-10-0-139-88 
kubenswrapper[2578]: I0416 19:05:22.403791 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403802 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403807 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403825 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="storage-initializer" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403831 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="storage-initializer" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403837 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403844 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" Apr 16 19:05:22.403888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403888 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f24903c-e578-42be-b401-e71c6d39b6a7" containerName="kserve-container" Apr 16 19:05:22.404183 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403899 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" Apr 16 
19:05:22.404183 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.403905 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" containerName="storage-initializer" Apr 16 19:05:22.408099 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.408084 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.409879 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.409861 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:05:22.410056 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.410037 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-89kqm\"" Apr 16 19:05:22.410342 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.410327 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:05:22.416792 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.416767 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86"] Apr 16 19:05:22.546487 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.546456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6eb3969c-3518-4145-90c3-b3a1056c2277-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.546649 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.546505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6eb3969c-3518-4145-90c3-b3a1056c2277-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.647068 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.647020 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6eb3969c-3518-4145-90c3-b3a1056c2277-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.647234 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.647088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6eb3969c-3518-4145-90c3-b3a1056c2277-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.647501 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.647481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6eb3969c-3518-4145-90c3-b3a1056c2277-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.647697 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.647681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/6eb3969c-3518-4145-90c3-b3a1056c2277-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.718820 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.718791 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:22.839709 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:22.839681 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86"] Apr 16 19:05:22.843747 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:05:22.843709 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb3969c_3518_4145_90c3_b3a1056c2277.slice/crio-98718fec4c44c8014cdc0b6cb223f0c4faf7775c16cfd5aaf723422e5dc50bb3 WatchSource:0}: Error finding container 98718fec4c44c8014cdc0b6cb223f0c4faf7775c16cfd5aaf723422e5dc50bb3: Status 404 returned error can't find the container with id 98718fec4c44c8014cdc0b6cb223f0c4faf7775c16cfd5aaf723422e5dc50bb3 Apr 16 19:05:23.484986 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:23.484941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" event={"ID":"6eb3969c-3518-4145-90c3-b3a1056c2277","Type":"ContainerStarted","Data":"ea2fff5897f792ee295a16ceb13d77b698b02f8cae0666b01f86b8c90a0b3e67"} Apr 16 19:05:23.484986 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:23.484982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" 
event={"ID":"6eb3969c-3518-4145-90c3-b3a1056c2277","Type":"ContainerStarted","Data":"98718fec4c44c8014cdc0b6cb223f0c4faf7775c16cfd5aaf723422e5dc50bb3"} Apr 16 19:05:23.648051 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:23.648011 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f571c100-a59e-4537-bcbe-eda344080c7b" path="/var/lib/kubelet/pods/f571c100-a59e-4537-bcbe-eda344080c7b/volumes" Apr 16 19:05:24.489730 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:24.489700 2578 generic.go:358] "Generic (PLEG): container finished" podID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerID="ea2fff5897f792ee295a16ceb13d77b698b02f8cae0666b01f86b8c90a0b3e67" exitCode=0 Apr 16 19:05:24.490131 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:24.489769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" event={"ID":"6eb3969c-3518-4145-90c3-b3a1056c2277","Type":"ContainerDied","Data":"ea2fff5897f792ee295a16ceb13d77b698b02f8cae0666b01f86b8c90a0b3e67"} Apr 16 19:05:25.494840 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:25.494807 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" event={"ID":"6eb3969c-3518-4145-90c3-b3a1056c2277","Type":"ContainerStarted","Data":"bdb62ed7a015c2fdcd67c0a9f9531ecdc15243c44477caf781974956cb83209c"} Apr 16 19:05:25.495255 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:25.495028 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:05:25.496245 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:25.496219 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:05:25.513434 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:25.513388 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podStartSLOduration=3.513373466 podStartE2EDuration="3.513373466s" podCreationTimestamp="2026-04-16 19:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:05:25.512292022 +0000 UTC m=+3720.455905092" watchObservedRunningTime="2026-04-16 19:05:25.513373466 +0000 UTC m=+3720.456986539" Apr 16 19:05:26.498482 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:26.498444 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:05:36.499038 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:36.498995 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:05:46.498380 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:46.498335 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:05:56.499048 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:05:56.499002 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:06:06.498627 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:06.498529 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:06:16.499010 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:16.498965 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:06:26.499080 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:26.499029 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:06:36.500067 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:36.500036 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:06:42.443285 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:42.443251 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86"] Apr 16 19:06:42.443752 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:42.443620 2578 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" containerID="cri-o://bdb62ed7a015c2fdcd67c0a9f9531ecdc15243c44477caf781974956cb83209c" gracePeriod=30 Apr 16 19:06:43.488003 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.487968 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7"] Apr 16 19:06:43.491289 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.491272 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:43.501460 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.501440 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7"] Apr 16 19:06:43.540836 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.540811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39b69521-47e9-4ea2-8e8e-365476f8a50d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7\" (UID: \"39b69521-47e9-4ea2-8e8e-365476f8a50d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:43.641250 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.641217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39b69521-47e9-4ea2-8e8e-365476f8a50d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7\" (UID: \"39b69521-47e9-4ea2-8e8e-365476f8a50d\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:43.641594 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.641574 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39b69521-47e9-4ea2-8e8e-365476f8a50d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7\" (UID: \"39b69521-47e9-4ea2-8e8e-365476f8a50d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:43.802264 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.802165 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:43.925213 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:43.925169 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7"] Apr 16 19:06:43.929085 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:06:43.929052 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b69521_47e9_4ea2_8e8e_365476f8a50d.slice/crio-6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7 WatchSource:0}: Error finding container 6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7: Status 404 returned error can't find the container with id 6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7 Apr 16 19:06:44.724878 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:44.724844 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" event={"ID":"39b69521-47e9-4ea2-8e8e-365476f8a50d","Type":"ContainerStarted","Data":"ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739"} Apr 16 19:06:44.724878 
ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:44.724878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" event={"ID":"39b69521-47e9-4ea2-8e8e-365476f8a50d","Type":"ContainerStarted","Data":"6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7"} Apr 16 19:06:46.498866 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.498821 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.69:8080: connect: connection refused" Apr 16 19:06:46.733228 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.733163 2578 generic.go:358] "Generic (PLEG): container finished" podID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerID="bdb62ed7a015c2fdcd67c0a9f9531ecdc15243c44477caf781974956cb83209c" exitCode=0 Apr 16 19:06:46.733380 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.733230 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" event={"ID":"6eb3969c-3518-4145-90c3-b3a1056c2277","Type":"ContainerDied","Data":"bdb62ed7a015c2fdcd67c0a9f9531ecdc15243c44477caf781974956cb83209c"} Apr 16 19:06:46.792959 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.792938 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:06:46.870978 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.870945 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6eb3969c-3518-4145-90c3-b3a1056c2277-kserve-provision-location\") pod \"6eb3969c-3518-4145-90c3-b3a1056c2277\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " Apr 16 19:06:46.871145 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.871000 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6eb3969c-3518-4145-90c3-b3a1056c2277-cabundle-cert\") pod \"6eb3969c-3518-4145-90c3-b3a1056c2277\" (UID: \"6eb3969c-3518-4145-90c3-b3a1056c2277\") " Apr 16 19:06:46.871299 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.871280 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb3969c-3518-4145-90c3-b3a1056c2277-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6eb3969c-3518-4145-90c3-b3a1056c2277" (UID: "6eb3969c-3518-4145-90c3-b3a1056c2277"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:06:46.871361 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.871342 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eb3969c-3518-4145-90c3-b3a1056c2277-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6eb3969c-3518-4145-90c3-b3a1056c2277" (UID: "6eb3969c-3518-4145-90c3-b3a1056c2277"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:06:46.971684 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.971652 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6eb3969c-3518-4145-90c3-b3a1056c2277-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:06:46.971684 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:46.971678 2578 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6eb3969c-3518-4145-90c3-b3a1056c2277-cabundle-cert\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:06:47.737436 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.737407 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/0.log" Apr 16 19:06:47.737877 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.737445 2578 generic.go:358] "Generic (PLEG): container finished" podID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerID="ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739" exitCode=1 Apr 16 19:06:47.737877 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.737531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" event={"ID":"39b69521-47e9-4ea2-8e8e-365476f8a50d","Type":"ContainerDied","Data":"ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739"} Apr 16 19:06:47.739067 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.739050 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" Apr 16 19:06:47.739159 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.739051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86" event={"ID":"6eb3969c-3518-4145-90c3-b3a1056c2277","Type":"ContainerDied","Data":"98718fec4c44c8014cdc0b6cb223f0c4faf7775c16cfd5aaf723422e5dc50bb3"} Apr 16 19:06:47.739159 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.739149 2578 scope.go:117] "RemoveContainer" containerID="bdb62ed7a015c2fdcd67c0a9f9531ecdc15243c44477caf781974956cb83209c" Apr 16 19:06:47.748720 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.748703 2578 scope.go:117] "RemoveContainer" containerID="ea2fff5897f792ee295a16ceb13d77b698b02f8cae0666b01f86b8c90a0b3e67" Apr 16 19:06:47.769758 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.769737 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86"] Apr 16 19:06:47.773888 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:47.773870 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-66db4f849b-c6b86"] Apr 16 19:06:48.748802 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:48.748778 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/0.log" Apr 16 19:06:48.749274 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:48.748867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" event={"ID":"39b69521-47e9-4ea2-8e8e-365476f8a50d","Type":"ContainerStarted","Data":"8000d26d4b29f618b880412251ed6fe829bd4266223f693acf6affa2623ac95a"} Apr 16 
19:06:49.648405 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:49.648375 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" path="/var/lib/kubelet/pods/6eb3969c-3518-4145-90c3-b3a1056c2277/volumes" Apr 16 19:06:50.758035 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:50.757962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/1.log" Apr 16 19:06:50.758418 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:50.758283 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/0.log" Apr 16 19:06:50.758418 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:50.758313 2578 generic.go:358] "Generic (PLEG): container finished" podID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerID="8000d26d4b29f618b880412251ed6fe829bd4266223f693acf6affa2623ac95a" exitCode=1 Apr 16 19:06:50.758418 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:50.758364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" event={"ID":"39b69521-47e9-4ea2-8e8e-365476f8a50d","Type":"ContainerDied","Data":"8000d26d4b29f618b880412251ed6fe829bd4266223f693acf6affa2623ac95a"} Apr 16 19:06:50.758418 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:50.758392 2578 scope.go:117] "RemoveContainer" containerID="ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739" Apr 16 19:06:50.758766 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:50.758748 2578 scope.go:117] "RemoveContainer" containerID="ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739" Apr 16 19:06:50.768850 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:06:50.768821 2578 log.go:32] "RemoveContainer from 
runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_kserve-ci-e2e-test_39b69521-47e9-4ea2-8e8e-365476f8a50d_0 in pod sandbox 6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7 from index: no such id: 'ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739'" containerID="ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739" Apr 16 19:06:50.768913 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:06:50.768870 2578 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_kserve-ci-e2e-test_39b69521-47e9-4ea2-8e8e-365476f8a50d_0 in pod sandbox 6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7 from index: no such id: 'ee0010b79b9cc12f639995e3d01a485b80f94b76426e8c21adf305338d445739'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_kserve-ci-e2e-test(39b69521-47e9-4ea2-8e8e-365476f8a50d)\"" logger="UnhandledError" Apr 16 19:06:50.770217 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:06:50.770178 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_kserve-ci-e2e-test(39b69521-47e9-4ea2-8e8e-365476f8a50d)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" Apr 16 19:06:51.762346 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:51.762318 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/1.log" Apr 16 19:06:53.512883 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.512839 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7"] Apr 16 19:06:53.633569 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.633550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/1.log" Apr 16 19:06:53.633696 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.633608 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:53.729661 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.729615 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39b69521-47e9-4ea2-8e8e-365476f8a50d-kserve-provision-location\") pod \"39b69521-47e9-4ea2-8e8e-365476f8a50d\" (UID: \"39b69521-47e9-4ea2-8e8e-365476f8a50d\") " Apr 16 19:06:53.729897 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.729875 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b69521-47e9-4ea2-8e8e-365476f8a50d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "39b69521-47e9-4ea2-8e8e-365476f8a50d" (UID: "39b69521-47e9-4ea2-8e8e-365476f8a50d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:06:53.770868 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.770794 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7_39b69521-47e9-4ea2-8e8e-365476f8a50d/storage-initializer/1.log" Apr 16 19:06:53.770996 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.770876 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" event={"ID":"39b69521-47e9-4ea2-8e8e-365476f8a50d","Type":"ContainerDied","Data":"6c7074c03bc05f6e98690b0616dea3fbf3c2d104ed24f833bc856ff2fa3065f7"} Apr 16 19:06:53.770996 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.770900 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7" Apr 16 19:06:53.770996 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.770903 2578 scope.go:117] "RemoveContainer" containerID="8000d26d4b29f618b880412251ed6fe829bd4266223f693acf6affa2623ac95a" Apr 16 19:06:53.804012 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.803981 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7"] Apr 16 19:06:53.808651 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.808625 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-7bfbcdd54b-zcht7"] Apr 16 19:06:53.830411 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:53.830390 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/39b69521-47e9-4ea2-8e8e-365476f8a50d-kserve-provision-location\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:06:55.365620 ip-10-0-139-88 kubenswrapper[2578]: I0416 
19:06:55.365584 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xcjx6/must-gather-nzdjm"] Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365932 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="storage-initializer" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365944 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="storage-initializer" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365953 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerName="storage-initializer" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365958 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerName="storage-initializer" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365968 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365974 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365987 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerName="storage-initializer" Apr 16 19:06:55.365988 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.365992 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerName="storage-initializer" Apr 16 19:06:55.366285 ip-10-0-139-88 kubenswrapper[2578]: I0416 
19:06:55.366041 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerName="storage-initializer" Apr 16 19:06:55.366285 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.366051 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" containerName="storage-initializer" Apr 16 19:06:55.366285 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.366057 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6eb3969c-3518-4145-90c3-b3a1056c2277" containerName="kserve-container" Apr 16 19:06:55.370700 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.370684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.372941 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.372920 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xcjx6\"/\"default-dockercfg-tw5t9\"" Apr 16 19:06:55.373093 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.373015 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xcjx6\"/\"kube-root-ca.crt\"" Apr 16 19:06:55.373314 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.373298 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xcjx6\"/\"openshift-service-ca.crt\"" Apr 16 19:06:55.381404 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.381381 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xcjx6/must-gather-nzdjm"] Apr 16 19:06:55.444640 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.444603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b54b1e28-380a-4a1d-9410-980524be1f75-must-gather-output\") pod \"must-gather-nzdjm\" (UID: 
\"b54b1e28-380a-4a1d-9410-980524be1f75\") " pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.444800 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.444672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsm9k\" (UniqueName: \"kubernetes.io/projected/b54b1e28-380a-4a1d-9410-980524be1f75-kube-api-access-hsm9k\") pod \"must-gather-nzdjm\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.545524 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.545491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b54b1e28-380a-4a1d-9410-980524be1f75-must-gather-output\") pod \"must-gather-nzdjm\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.545672 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.545551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsm9k\" (UniqueName: \"kubernetes.io/projected/b54b1e28-380a-4a1d-9410-980524be1f75-kube-api-access-hsm9k\") pod \"must-gather-nzdjm\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.545823 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.545802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b54b1e28-380a-4a1d-9410-980524be1f75-must-gather-output\") pod \"must-gather-nzdjm\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.553126 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.553098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsm9k\" (UniqueName: 
\"kubernetes.io/projected/b54b1e28-380a-4a1d-9410-980524be1f75-kube-api-access-hsm9k\") pod \"must-gather-nzdjm\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.648860 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.648779 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b69521-47e9-4ea2-8e8e-365476f8a50d" path="/var/lib/kubelet/pods/39b69521-47e9-4ea2-8e8e-365476f8a50d/volumes" Apr 16 19:06:55.693729 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.693702 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:06:55.813505 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:55.813412 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xcjx6/must-gather-nzdjm"] Apr 16 19:06:55.816227 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:06:55.816185 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb54b1e28_380a_4a1d_9410_980524be1f75.slice/crio-84ff5b26d617c370865d9697915b1556e6169759bece144f04af47efc34bffae WatchSource:0}: Error finding container 84ff5b26d617c370865d9697915b1556e6169759bece144f04af47efc34bffae: Status 404 returned error can't find the container with id 84ff5b26d617c370865d9697915b1556e6169759bece144f04af47efc34bffae Apr 16 19:06:56.787024 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:06:56.786858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" event={"ID":"b54b1e28-380a-4a1d-9410-980524be1f75","Type":"ContainerStarted","Data":"84ff5b26d617c370865d9697915b1556e6169759bece144f04af47efc34bffae"} Apr 16 19:07:00.807200 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:00.807160 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" 
event={"ID":"b54b1e28-380a-4a1d-9410-980524be1f75","Type":"ContainerStarted","Data":"6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f"} Apr 16 19:07:00.807557 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:00.807217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" event={"ID":"b54b1e28-380a-4a1d-9410-980524be1f75","Type":"ContainerStarted","Data":"06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16"} Apr 16 19:07:00.826705 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:00.826643 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" podStartSLOduration=1.8208361069999999 podStartE2EDuration="5.826626877s" podCreationTimestamp="2026-04-16 19:06:55 +0000 UTC" firstStartedPulling="2026-04-16 19:06:55.81790361 +0000 UTC m=+3810.761516648" lastFinishedPulling="2026-04-16 19:06:59.82369438 +0000 UTC m=+3814.767307418" observedRunningTime="2026-04-16 19:07:00.823439977 +0000 UTC m=+3815.767053033" watchObservedRunningTime="2026-04-16 19:07:00.826626877 +0000 UTC m=+3815.770239933" Apr 16 19:07:21.875738 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:21.875650 2578 generic.go:358] "Generic (PLEG): container finished" podID="b54b1e28-380a-4a1d-9410-980524be1f75" containerID="06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16" exitCode=0 Apr 16 19:07:21.875738 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:21.875716 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" event={"ID":"b54b1e28-380a-4a1d-9410-980524be1f75","Type":"ContainerDied","Data":"06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16"} Apr 16 19:07:21.876254 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:21.876019 2578 scope.go:117] "RemoveContainer" containerID="06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16" Apr 16 19:07:22.576673 ip-10-0-139-88 
kubenswrapper[2578]: I0416 19:07:22.576644 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xcjx6_must-gather-nzdjm_b54b1e28-380a-4a1d-9410-980524be1f75/gather/0.log" Apr 16 19:07:23.245141 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.245105 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s228v/must-gather-7fklx"] Apr 16 19:07:23.249021 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.249002 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.251091 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.251071 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s228v\"/\"kube-root-ca.crt\"" Apr 16 19:07:23.251486 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.251465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s228v\"/\"default-dockercfg-8h7sl\"" Apr 16 19:07:23.251578 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.251490 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s228v\"/\"openshift-service-ca.crt\"" Apr 16 19:07:23.261260 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.261238 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/must-gather-7fklx"] Apr 16 19:07:23.289579 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.289553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69219b84-bf6c-42f4-8be7-cf5ffe3dd76a-must-gather-output\") pod \"must-gather-7fklx\" (UID: \"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a\") " pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.289724 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.289583 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9qt\" (UniqueName: \"kubernetes.io/projected/69219b84-bf6c-42f4-8be7-cf5ffe3dd76a-kube-api-access-ng9qt\") pod \"must-gather-7fklx\" (UID: \"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a\") " pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.391099 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.391059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69219b84-bf6c-42f4-8be7-cf5ffe3dd76a-must-gather-output\") pod \"must-gather-7fklx\" (UID: \"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a\") " pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.391099 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.391098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9qt\" (UniqueName: \"kubernetes.io/projected/69219b84-bf6c-42f4-8be7-cf5ffe3dd76a-kube-api-access-ng9qt\") pod \"must-gather-7fklx\" (UID: \"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a\") " pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.391413 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.391394 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69219b84-bf6c-42f4-8be7-cf5ffe3dd76a-must-gather-output\") pod \"must-gather-7fklx\" (UID: \"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a\") " pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.398977 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.398954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9qt\" (UniqueName: \"kubernetes.io/projected/69219b84-bf6c-42f4-8be7-cf5ffe3dd76a-kube-api-access-ng9qt\") pod \"must-gather-7fklx\" (UID: \"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a\") " pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.557945 
ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.557855 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s228v/must-gather-7fklx" Apr 16 19:07:23.678120 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.678096 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/must-gather-7fklx"] Apr 16 19:07:23.680792 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:07:23.680763 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69219b84_bf6c_42f4_8be7_cf5ffe3dd76a.slice/crio-b75d85b015fbc152294a44f3c8d34b587445be22e46efcdd54bcd3c54ea708f2 WatchSource:0}: Error finding container b75d85b015fbc152294a44f3c8d34b587445be22e46efcdd54bcd3c54ea708f2: Status 404 returned error can't find the container with id b75d85b015fbc152294a44f3c8d34b587445be22e46efcdd54bcd3c54ea708f2 Apr 16 19:07:23.682440 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.682420 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:07:23.883933 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:23.883849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/must-gather-7fklx" event={"ID":"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a","Type":"ContainerStarted","Data":"b75d85b015fbc152294a44f3c8d34b587445be22e46efcdd54bcd3c54ea708f2"} Apr 16 19:07:24.888475 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:24.888441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/must-gather-7fklx" event={"ID":"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a","Type":"ContainerStarted","Data":"d605628f0f1bd196632343c2491a52f820b742d8f2f4cb8b0410d42001c3ed4c"} Apr 16 19:07:24.888848 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:24.888481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/must-gather-7fklx" 
event={"ID":"69219b84-bf6c-42f4-8be7-cf5ffe3dd76a","Type":"ContainerStarted","Data":"70b95fdd65c727b332f1c1ef79b864573d12fb5668b051405c054bc79af2ebec"} Apr 16 19:07:24.907272 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:24.907222 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s228v/must-gather-7fklx" podStartSLOduration=1.187236469 podStartE2EDuration="1.907203039s" podCreationTimestamp="2026-04-16 19:07:23 +0000 UTC" firstStartedPulling="2026-04-16 19:07:23.682545693 +0000 UTC m=+3838.626158726" lastFinishedPulling="2026-04-16 19:07:24.40251226 +0000 UTC m=+3839.346125296" observedRunningTime="2026-04-16 19:07:24.905538852 +0000 UTC m=+3839.849151910" watchObservedRunningTime="2026-04-16 19:07:24.907203039 +0000 UTC m=+3839.850816091" Apr 16 19:07:26.130578 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:26.130546 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rvmjz_aaac5506-c30b-46a4-b8d2-8cffc2dc83d7/global-pull-secret-syncer/0.log" Apr 16 19:07:26.273333 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:26.273282 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hhkz9_71ef152c-1129-42dc-8a47-99e2ea30df5b/konnectivity-agent/0.log" Apr 16 19:07:26.406324 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:26.406237 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-88.ec2.internal_a22c454fc2ba5924d3a6892717c717ec/haproxy/0.log" Apr 16 19:07:28.078161 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.078117 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xcjx6/must-gather-nzdjm"] Apr 16 19:07:28.078661 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.078454 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" 
podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="copy" containerID="cri-o://6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f" gracePeriod=2 Apr 16 19:07:28.081573 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.081543 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xcjx6/must-gather-nzdjm"] Apr 16 19:07:28.081789 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.081763 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" err="pods \"must-gather-nzdjm\" is forbidden: User \"system:node:ip-10-0-139-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xcjx6\": no relationship found between node 'ip-10-0-139-88.ec2.internal' and this object" Apr 16 19:07:28.490449 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.488492 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xcjx6_must-gather-nzdjm_b54b1e28-380a-4a1d-9410-980524be1f75/copy/0.log" Apr 16 19:07:28.490917 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.490896 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:07:28.492692 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.492661 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" err="pods \"must-gather-nzdjm\" is forbidden: User \"system:node:ip-10-0-139-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xcjx6\": no relationship found between node 'ip-10-0-139-88.ec2.internal' and this object" Apr 16 19:07:28.563831 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.561824 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsm9k\" (UniqueName: \"kubernetes.io/projected/b54b1e28-380a-4a1d-9410-980524be1f75-kube-api-access-hsm9k\") pod \"b54b1e28-380a-4a1d-9410-980524be1f75\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " Apr 16 19:07:28.563831 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.561884 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b54b1e28-380a-4a1d-9410-980524be1f75-must-gather-output\") pod \"b54b1e28-380a-4a1d-9410-980524be1f75\" (UID: \"b54b1e28-380a-4a1d-9410-980524be1f75\") " Apr 16 19:07:28.563831 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.563523 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54b1e28-380a-4a1d-9410-980524be1f75-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b54b1e28-380a-4a1d-9410-980524be1f75" (UID: "b54b1e28-380a-4a1d-9410-980524be1f75"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:07:28.571459 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.571416 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54b1e28-380a-4a1d-9410-980524be1f75-kube-api-access-hsm9k" (OuterVolumeSpecName: "kube-api-access-hsm9k") pod "b54b1e28-380a-4a1d-9410-980524be1f75" (UID: "b54b1e28-380a-4a1d-9410-980524be1f75"). InnerVolumeSpecName "kube-api-access-hsm9k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:07:28.662992 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.662877 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsm9k\" (UniqueName: \"kubernetes.io/projected/b54b1e28-380a-4a1d-9410-980524be1f75-kube-api-access-hsm9k\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:07:28.662992 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.662915 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b54b1e28-380a-4a1d-9410-980524be1f75-must-gather-output\") on node \"ip-10-0-139-88.ec2.internal\" DevicePath \"\"" Apr 16 19:07:28.905346 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.905308 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xcjx6_must-gather-nzdjm_b54b1e28-380a-4a1d-9410-980524be1f75/copy/0.log" Apr 16 19:07:28.906266 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.906239 2578 generic.go:358] "Generic (PLEG): container finished" podID="b54b1e28-380a-4a1d-9410-980524be1f75" containerID="6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f" exitCode=143 Apr 16 19:07:28.906521 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.906316 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" Apr 16 19:07:28.906521 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.906347 2578 scope.go:117] "RemoveContainer" containerID="6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f" Apr 16 19:07:28.908767 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.908737 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" err="pods \"must-gather-nzdjm\" is forbidden: User \"system:node:ip-10-0-139-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xcjx6\": no relationship found between node 'ip-10-0-139-88.ec2.internal' and this object" Apr 16 19:07:28.925713 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.925315 2578 scope.go:117] "RemoveContainer" containerID="06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16" Apr 16 19:07:28.925713 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.925523 2578 status_manager.go:895] "Failed to get status for pod" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" pod="openshift-must-gather-xcjx6/must-gather-nzdjm" err="pods \"must-gather-nzdjm\" is forbidden: User \"system:node:ip-10-0-139-88.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xcjx6\": no relationship found between node 'ip-10-0-139-88.ec2.internal' and this object" Apr 16 19:07:28.947908 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.947879 2578 scope.go:117] "RemoveContainer" containerID="6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f" Apr 16 19:07:28.948545 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:07:28.948517 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f\": container with ID starting 
with 6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f not found: ID does not exist" containerID="6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f" Apr 16 19:07:28.948722 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.948698 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f"} err="failed to get container status \"6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f\": rpc error: code = NotFound desc = could not find container \"6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f\": container with ID starting with 6963c242ae8b6732a58157430852835ebf3c537a5624ed84f75af0b1beb9e95f not found: ID does not exist" Apr 16 19:07:28.948828 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.948817 2578 scope.go:117] "RemoveContainer" containerID="06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16" Apr 16 19:07:28.949345 ip-10-0-139-88 kubenswrapper[2578]: E0416 19:07:28.949280 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16\": container with ID starting with 06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16 not found: ID does not exist" containerID="06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16" Apr 16 19:07:28.949345 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:28.949310 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16"} err="failed to get container status \"06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16\": rpc error: code = NotFound desc = could not find container \"06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16\": container with ID starting with 
06e697c238a3af5715121b81214e1eca9d54ae75f2a875d62a485139e47f1d16 not found: ID does not exist" Apr 16 19:07:29.651948 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.651907 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" path="/var/lib/kubelet/pods/b54b1e28-380a-4a1d-9410-980524be1f75/volumes" Apr 16 19:07:29.710093 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.710057 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/alertmanager/0.log" Apr 16 19:07:29.742054 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.742025 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/config-reloader/0.log" Apr 16 19:07:29.774061 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.774029 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/kube-rbac-proxy-web/0.log" Apr 16 19:07:29.805768 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.805743 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/kube-rbac-proxy/0.log" Apr 16 19:07:29.832914 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.832882 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/kube-rbac-proxy-metric/0.log" Apr 16 19:07:29.860322 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.860294 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/prom-label-proxy/0.log" Apr 16 19:07:29.892428 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:29.892390 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_17e65dd5-ef7e-42a0-a302-3785c49c48ab/init-config-reloader/0.log" Apr 16 19:07:30.010017 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.009990 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-tmbf4_ece5b02a-1c86-402d-a07a-3645d98afe73/kube-state-metrics/0.log" Apr 16 19:07:30.065291 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.065265 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-tmbf4_ece5b02a-1c86-402d-a07a-3645d98afe73/kube-rbac-proxy-main/0.log" Apr 16 19:07:30.119762 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.119732 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-tmbf4_ece5b02a-1c86-402d-a07a-3645d98afe73/kube-rbac-proxy-self/0.log" Apr 16 19:07:30.171398 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.171373 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-78dcd49d79-g868h_df893896-7d60-4285-a325-30152ee3c5bd/metrics-server/0.log" Apr 16 19:07:30.202283 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.202255 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-52qb6_008efc52-3ff5-42b0-985a-90b8699c1cda/monitoring-plugin/0.log" Apr 16 19:07:30.239792 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.239763 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9vhnl_7e057c09-2a88-4a81-99a0-c209f07556a8/node-exporter/0.log" Apr 16 19:07:30.269768 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.269692 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9vhnl_7e057c09-2a88-4a81-99a0-c209f07556a8/kube-rbac-proxy/0.log" Apr 16 19:07:30.298320 ip-10-0-139-88 kubenswrapper[2578]: I0416 
19:07:30.298298 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9vhnl_7e057c09-2a88-4a81-99a0-c209f07556a8/init-textfile/0.log" Apr 16 19:07:30.505579 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.505544 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-vz96w_9bae2b33-3e4a-4468-84d7-208d6ae92a1a/kube-rbac-proxy-main/0.log" Apr 16 19:07:30.535657 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.535544 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-vz96w_9bae2b33-3e4a-4468-84d7-208d6ae92a1a/kube-rbac-proxy-self/0.log" Apr 16 19:07:30.565962 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.565929 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-vz96w_9bae2b33-3e4a-4468-84d7-208d6ae92a1a/openshift-state-metrics/0.log" Apr 16 19:07:30.637675 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.637640 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/prometheus/0.log" Apr 16 19:07:30.663523 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.663496 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/config-reloader/0.log" Apr 16 19:07:30.692093 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.692062 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/thanos-sidecar/0.log" Apr 16 19:07:30.722525 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.722486 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/kube-rbac-proxy-web/0.log" Apr 16 19:07:30.750901 
ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.750869 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/kube-rbac-proxy/0.log" Apr 16 19:07:30.780334 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.780300 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/kube-rbac-proxy-thanos/0.log" Apr 16 19:07:30.808855 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.808747 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ca490224-c264-49dc-b0ac-bb60473d99d5/init-config-reloader/0.log" Apr 16 19:07:30.945396 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.945365 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-684bc67856-gnfsf_3e82dd6c-3235-423d-a5db-7a937f30e2ed/telemeter-client/0.log" Apr 16 19:07:30.976540 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:30.976511 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-684bc67856-gnfsf_3e82dd6c-3235-423d-a5db-7a937f30e2ed/reload/0.log" Apr 16 19:07:31.005017 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.004982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-684bc67856-gnfsf_3e82dd6c-3235-423d-a5db-7a937f30e2ed/kube-rbac-proxy/0.log" Apr 16 19:07:31.059504 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.059426 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/thanos-query/0.log" Apr 16 19:07:31.110336 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.110219 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy-web/0.log" Apr 16 19:07:31.161059 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.160881 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy/0.log" Apr 16 19:07:31.206121 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.206091 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/prom-label-proxy/0.log" Apr 16 19:07:31.240074 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.240034 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy-rules/0.log" Apr 16 19:07:31.279559 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:31.279521 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7675b699c9-9254t_9f3e75c0-4839-425b-951d-c38abf6a16b5/kube-rbac-proxy-metrics/0.log" Apr 16 19:07:33.351963 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.351929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv"] Apr 16 19:07:33.352491 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.352475 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="gather" Apr 16 19:07:33.352545 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.352495 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="gather" Apr 16 19:07:33.352545 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.352525 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="copy" Apr 16 19:07:33.352545 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.352533 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="copy" Apr 16 19:07:33.352649 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.352597 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="copy" Apr 16 19:07:33.352649 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.352617 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b54b1e28-380a-4a1d-9410-980524be1f75" containerName="gather" Apr 16 19:07:33.356802 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.356784 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.388265 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.388231 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv"] Apr 16 19:07:33.509831 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.509787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-proc\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.510040 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.509855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-lib-modules\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 
16 19:07:33.510040 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.509917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-sys\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.510040 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.509938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69lmk\" (UniqueName: \"kubernetes.io/projected/def0ff13-2d48-478a-a61d-1536e0edb649-kube-api-access-69lmk\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.510040 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.509978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-podres\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611048 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.610956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-proc\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611048 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-lib-modules\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611305 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-sys\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611305 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69lmk\" (UniqueName: \"kubernetes.io/projected/def0ff13-2d48-478a-a61d-1536e0edb649-kube-api-access-69lmk\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611305 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-podres\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611465 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-podres\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611465 ip-10-0-139-88 kubenswrapper[2578]: I0416 
19:07:33.611410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-proc\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611582 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611491 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-lib-modules\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.611582 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.611536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def0ff13-2d48-478a-a61d-1536e0edb649-sys\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.624002 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.623977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69lmk\" (UniqueName: \"kubernetes.io/projected/def0ff13-2d48-478a-a61d-1536e0edb649-kube-api-access-69lmk\") pod \"perf-node-gather-daemonset-l6mjv\" (UID: \"def0ff13-2d48-478a-a61d-1536e0edb649\") " pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.667680 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.667650 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:33.809136 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.809106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv"] Apr 16 19:07:33.812953 ip-10-0-139-88 kubenswrapper[2578]: W0416 19:07:33.812924 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddef0ff13_2d48_478a_a61d_1536e0edb649.slice/crio-b4fe249ede710a960b062828c4f057f62f54943026809d5fcbbc8f3e6ddf8337 WatchSource:0}: Error finding container b4fe249ede710a960b062828c4f057f62f54943026809d5fcbbc8f3e6ddf8337: Status 404 returned error can't find the container with id b4fe249ede710a960b062828c4f057f62f54943026809d5fcbbc8f3e6ddf8337 Apr 16 19:07:33.927840 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:33.927763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" event={"ID":"def0ff13-2d48-478a-a61d-1536e0edb649","Type":"ContainerStarted","Data":"b4fe249ede710a960b062828c4f057f62f54943026809d5fcbbc8f3e6ddf8337"} Apr 16 19:07:34.373892 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:34.373868 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6dthq_b1164c58-e91c-4b0c-93e5-28d0244988b6/dns/0.log" Apr 16 19:07:34.397019 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:34.396989 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6dthq_b1164c58-e91c-4b0c-93e5-28d0244988b6/kube-rbac-proxy/0.log" Apr 16 19:07:34.473895 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:34.473865 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hxbd9_2811e81e-8a8b-4ab5-903d-22cce72663e2/dns-node-resolver/0.log" Apr 16 19:07:34.932306 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:34.932268 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" event={"ID":"def0ff13-2d48-478a-a61d-1536e0edb649","Type":"ContainerStarted","Data":"0d22ab0078bbaf73c5942720dda4b9705d85eeda6d8e1db8a2d2a27c9185c70c"} Apr 16 19:07:34.932500 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:34.932481 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:34.949244 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:34.949183 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" podStartSLOduration=1.949168789 podStartE2EDuration="1.949168789s" podCreationTimestamp="2026-04-16 19:07:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:07:34.948072672 +0000 UTC m=+3849.891685728" watchObservedRunningTime="2026-04-16 19:07:34.949168789 +0000 UTC m=+3849.892781845" Apr 16 19:07:35.046471 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:35.046442 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-g8588_935e60a3-a3a8-4cd1-b842-f7f54efe8cb8/node-ca/0.log" Apr 16 19:07:36.194441 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:36.194366 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-79ghb_c118da2f-c36c-4fca-859c-34ed40076370/serve-healthcheck-canary/0.log" Apr 16 19:07:36.641447 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:36.641422 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7t9nd_1ab75e1c-1aff-4d37-a06b-03b4f389061d/kube-rbac-proxy/0.log" Apr 16 19:07:36.665530 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:36.665493 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-7t9nd_1ab75e1c-1aff-4d37-a06b-03b4f389061d/exporter/0.log" Apr 16 19:07:36.690346 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:36.690320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7t9nd_1ab75e1c-1aff-4d37-a06b-03b4f389061d/extractor/0.log" Apr 16 19:07:38.988230 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:38.988178 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-dpxb4_91d213d4-bf2d-4e43-8f99-8ffa7ad0ed63/manager/0.log" Apr 16 19:07:39.358656 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:39.358576 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sr59x_d5d1d706-2c44-42fd-8877-a5959df4e519/s3-init/0.log" Apr 16 19:07:39.412610 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:39.412581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-8dnj6_268b1e70-1f96-45bd-9864-b90de3b6d78d/s3-tls-init-custom/0.log" Apr 16 19:07:39.466276 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:39.466249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-d8ts8_46b85c45-3c3c-4bc2-887b-1773012871f3/s3-tls-init-serving/0.log" Apr 16 19:07:40.948779 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:40.948740 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-s228v/perf-node-gather-daemonset-l6mjv" Apr 16 19:07:45.548569 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.548497 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/kube-multus-additional-cni-plugins/0.log" Apr 16 19:07:45.596241 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.596208 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/egress-router-binary-copy/0.log" Apr 16 19:07:45.632564 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.632524 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/cni-plugins/0.log" Apr 16 19:07:45.661378 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.661320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/bond-cni-plugin/0.log" Apr 16 19:07:45.688325 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.688300 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/routeoverride-cni/0.log" Apr 16 19:07:45.712800 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.712773 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/whereabouts-cni-bincopy/0.log" Apr 16 19:07:45.738041 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.738010 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vbbl2_2aa8dcfb-ecd1-405d-aadb-af7f9cd89d77/whereabouts-cni/0.log" Apr 16 19:07:45.821962 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.821863 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rx2m9_2119e250-ea67-47ef-ab06-f2ae21b8044f/kube-multus/0.log" Apr 16 19:07:45.964620 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:45.964538 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-t49dd_87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0/network-metrics-daemon/0.log" Apr 16 19:07:45.982743 ip-10-0-139-88 kubenswrapper[2578]: I0416 
19:07:45.982717 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-t49dd_87aa22c0-e9f4-4d96-b5ce-9ac0e4521ab0/kube-rbac-proxy/0.log" Apr 16 19:07:46.760065 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.760037 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/ovn-controller/0.log" Apr 16 19:07:46.815761 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.815735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/ovn-acl-logging/0.log" Apr 16 19:07:46.838088 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.838065 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/kube-rbac-proxy-node/0.log" Apr 16 19:07:46.864948 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.864924 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:07:46.889006 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.888982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/northd/0.log" Apr 16 19:07:46.912693 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.912652 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/nbdb/0.log" Apr 16 19:07:46.939512 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:46.939487 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/sbdb/0.log" Apr 16 19:07:47.161534 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:47.161460 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2pmhr_6b4ee727-773d-47fd-8e52-976f88918e9d/ovnkube-controller/0.log" Apr 16 19:07:48.845710 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:48.845679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9qqgk_4af28085-16ca-4a81-b155-9c85f1f05a68/network-check-target-container/0.log" Apr 16 19:07:49.833855 ip-10-0-139-88 kubenswrapper[2578]: I0416 19:07:49.833830 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2x6mt_552c454d-9201-4aa0-bce5-2d6ee55ba1c7/iptables-alerter/0.log"