Apr 22 18:43:57.723198 ip-10-0-128-208 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:43:57.723210 ip-10-0-128-208 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:43:57.723219 ip-10-0-128-208 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:43:57.723531 ip-10-0-128-208 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:44:07.864247 ip-10-0-128-208 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:44:07.864269 ip-10-0-128-208 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot feb9c2d996364604903755850dbbcc00 --
Apr 22 18:46:28.210354 ip-10-0-128-208 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:28.621149 ip-10-0-128-208 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:28.621149 ip-10-0-128-208 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:28.621149 ip-10-0-128-208 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:28.621149 ip-10-0-128-208 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:28.621149 ip-10-0-128-208 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:28.621871 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.621795 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:28.625455 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625440 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:28.625455 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625456 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625461 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625465 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625468 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625471 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625474 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625477 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625480 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625483 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625486 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625488 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625491 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625493 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625496 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625499 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625501 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625504 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625506 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625509 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:28.625524 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625513 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625517 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625520 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625524 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625527 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625529 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625532 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625535 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625538 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625541 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625543 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625546 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625548 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625550 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625553 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625556 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625558 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625561 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625563 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625566 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:28.626007 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625569 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625577 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625580 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625582 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625585 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625587 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625590 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625592 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625595 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625597 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625600 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625602 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625605 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625607 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625610 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625614 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625616 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625619 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625622 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625624 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:28.626640 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625627 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625629 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625633 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625635 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625638 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625640 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625649 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625651 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625654 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625656 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625659 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625661 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625664 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625666 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625669 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625671 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625674 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625676 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625679 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625681 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:28.627138 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625684 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625687 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625689 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625692 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625695 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.625697 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626133 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626139 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626142 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626144 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626147 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626150 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626152 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626155 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626157 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626160 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626162 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626165 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626169 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626171 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:28.627616 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626174 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626176 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626179 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626182 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626184 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626187 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626189 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626192 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626194 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626197 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626199 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626202 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626205 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626208 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626210 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626213 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626216 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626218 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626221 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626224 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:28.628110 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626226 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626229 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626231 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626234 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626236 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626239 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626241 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626244 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626246 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626248 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626251 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626254 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626256 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626259 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626262 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626264 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626266 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626269 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626271 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626274 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:28.628647 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626276 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626279 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626282 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626284 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626287 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626290 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626293 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626296 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626298 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626301 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626304 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626306 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626309 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626312 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626315 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626317 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626320 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626322 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626324 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626327 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:28.629148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626329 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626332 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626334 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626336 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626340 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626342 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626345 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626347 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626349 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626354 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626359 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.626363 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627687 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627696 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627703 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627708 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627712 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627716 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627721 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627725 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:28.629652 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627729 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627732 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627737 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627740 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627743 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627746 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627749 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627752 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627755 2562 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627758 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627761 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627765 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627768 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627771 2562 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627773 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627777 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627781 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627784 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627788 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627791 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627793 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627797 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627800 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627803 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627806 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:28.630143 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627810 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627813 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627816 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627819 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627822 2562 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627825 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627830 2562 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627833 2562 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627836 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627839 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627842 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627846 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627849 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627852 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627855 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627858 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627861 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627864 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627868 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627870 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627873 2562 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627876 2562 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627880 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:46:28.630741 ip-10-0-128-208
kubenswrapper[2562]: I0422 18:46:28.627883 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627887 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:28.630741 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627890 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627893 2562 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627897 2562 flags.go:64] FLAG: --help="false" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627900 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-128-208.ec2.internal" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627903 2562 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627906 2562 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627909 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627912 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627915 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627918 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627921 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627924 2562 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627927 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627930 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627933 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627936 2562 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627939 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627942 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627945 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627948 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627951 2562 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627953 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627956 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627960 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:28.631352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627965 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627967 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:28.631963 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:46:28.627970 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627973 2562 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627976 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627979 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627982 2562 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627985 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627990 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627993 2562 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.627997 2562 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628000 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628003 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628006 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628009 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628023 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628026 2562 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628029 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628037 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628040 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628043 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628046 2562 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628049 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:28.631963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628054 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628057 2562 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628061 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628063 2562 flags.go:64] FLAG: --port="10250" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628067 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628069 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-051ba35853604a9bb" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628073 2562 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628075 
2562 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628078 2562 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628081 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628084 2562 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628088 2562 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628091 2562 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628093 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628096 2562 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628100 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628103 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628111 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628114 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628117 2562 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628120 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628123 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:46:28.628126 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628129 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628132 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628135 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:28.632544 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628138 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628141 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628144 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628147 2562 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628150 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628153 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628156 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628159 2562 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628162 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628167 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628170 2562 flags.go:64] FLAG: 
--tls-cert-file="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628173 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628177 2562 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628180 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628183 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628185 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628188 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628191 2562 flags.go:64] FLAG: --v="2" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628195 2562 flags.go:64] FLAG: --version="false" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628199 2562 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628203 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.628206 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628316 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628322 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:28.633183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628326 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:28.633772 
ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628329 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628332 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628335 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628338 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628341 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628344 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628347 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628349 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628353 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628356 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628358 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628361 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628364 2562 feature_gate.go:328] unrecognized feature 
gate: NoRegistryClusterOperations Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628367 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628370 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628372 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628375 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628378 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628381 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:28.633772 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628383 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628387 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628390 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628393 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628396 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628398 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628401 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628403 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628405 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628408 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628410 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628414 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628418 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628421 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628424 
2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628427 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628429 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628432 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628434 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:28.634333 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628437 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628440 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628442 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628445 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628447 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628450 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628452 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628456 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628460 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628463 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628466 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628468 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628471 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628473 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628476 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628478 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628481 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628483 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628486 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:28.634802 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628488 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 
18:46:28.628491 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628493 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628496 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628498 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628502 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628506 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628508 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628511 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628514 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628517 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628519 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628522 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628524 2562 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628527 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628529 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628532 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628535 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628537 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628540 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:28.635278 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628542 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628545 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628548 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628550 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628553 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.628555 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.629160 2562 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.635164 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.635179 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635227 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635232 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635235 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635238 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635241 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635245 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635247 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:28.635762 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635250 2562 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635252 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635255 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635258 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635260 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635263 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635266 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635269 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635272 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635275 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635277 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635280 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635283 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:28.636183 
ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635286 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635289 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635292 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635294 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635297 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635299 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635302 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:28.636183 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635306 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635310 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635312 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635315 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635319 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635325 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635328 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635330 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635333 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635336 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635338 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635341 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635343 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635346 2562 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635348 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635351 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635354 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635357 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635360 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635363 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:28.636675 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635365 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635368 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635371 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635374 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635377 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635381 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635383 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635386 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635388 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635391 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635393 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635396 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635398 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635401 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635404 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635406 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635410 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635413 2562 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635416 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:28.637177 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635418 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635421 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635424 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635426 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635429 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635431 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635434 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635436 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635439 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635442 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635444 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 
18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635447 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635450 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635453 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635455 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635458 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635460 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635463 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635465 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:28.637674 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635467 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.635473 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635567 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635571 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635574 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635577 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635581 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635583 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635586 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635589 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635592 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635596 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635599 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635601 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 
18:46:28.635605 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:28.638153 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635609 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635612 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635614 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635617 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635620 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635622 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635625 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635627 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635630 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635633 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635635 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635638 2562 feature_gate.go:328] unrecognized 
feature gate: NetworkLiveMigration Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635641 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635643 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635646 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635648 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635651 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635653 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635656 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635658 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:28.638515 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635661 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635663 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635666 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635668 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635670 2562 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635673 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635676 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635678 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635681 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635684 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635686 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635689 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635692 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635694 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635697 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635699 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635701 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: 
W0422 18:46:28.635704 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635707 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635710 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:28.638992 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635713 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635715 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635718 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635720 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635722 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635725 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635728 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635730 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635733 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635735 2562 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635737 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635740 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635742 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635745 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635747 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635750 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635752 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635755 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635757 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:28.639497 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635760 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635763 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635766 2562 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635768 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635771 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635774 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635776 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635779 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635781 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635784 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635786 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635789 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635791 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:28.635794 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.635799 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:28.639962 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.636413 2562 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:46:28.640365 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.640146 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:46:28.641720 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.641707 2562 server.go:1019] "Starting client certificate rotation" Apr 22 18:46:28.641820 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.641804 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:28.641854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.641845 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:28.665760 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.665730 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:28.669566 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.669461 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:28.686286 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.686265 2562 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:46:28.691382 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.691367 2562 log.go:25] "Validated CRI v1 image API" Apr 22 18:46:28.692633 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:46:28.692608 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:46:28.694552 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.694535 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:28.698163 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.698139 2562 fs.go:135] Filesystem UUIDs: map[36e81f28-89c1-4400-b1ad-35847de3d92a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c2ed4c67-cfd0-42ec-bc39-634ff1f21a49:/dev/nvme0n1p4]
Apr 22 18:46:28.698224 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.698162 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:46:28.703664 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.703561 2562 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:28.701740434 +0000 UTC m=+0.380966717 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100275 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b742b7f0aeaee4c5d7cf5c893ecf9 SystemUUID:ec2b742b-7f0a-eaee-4c5d-7cf5c893ecf9 BootID:feb9c2d9-9636-4604-9037-55850dbbcc00 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:97:23:f1:ff:cd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:97:23:f1:ff:cd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:94:5e:7c:86:25 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:46:28.703664 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.703659 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:46:28.703777 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.703727 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:46:28.704762 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.704736 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:46:28.704884 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.704763 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-208.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:46:28.704930 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.704893 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:46:28.704930 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.704902 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:46:28.704930 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.704914 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:46:28.705808 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.705797 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:46:28.707116 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.707106 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:46:28.707218 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.707210 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:46:28.710280 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.710268 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:46:28.710330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.710290 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:46:28.710330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.710305 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:46:28.710330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.710314 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:46:28.710330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.710322 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:46:28.711895 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.711880 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:46:28.711895 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.711899 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:46:28.714751 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.714736 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 18:46:28.715627 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.715613 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dqdcl"
Apr 22 18:46:28.716526 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.716513 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 18:46:28.717750 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717734 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717768 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717775 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717781 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717787 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717792 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717798 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717803 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717809 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 18:46:28.717821 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717815 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 18:46:28.718233 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717832 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 18:46:28.718233 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.717841 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 18:46:28.718633 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.718623 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 18:46:28.718633 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.718633 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 18:46:28.722488 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.721427 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-208.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 18:46:28.722488 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.721464 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 18:46:28.722488 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.722163 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 18:46:28.722488 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.722218 2562 server.go:1295] "Started kubelet"
Apr 22 18:46:28.722699 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.722504 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 18:46:28.722699 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.722543 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 18:46:28.722699 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.722592 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 18:46:28.723069 ip-10-0-128-208 systemd[1]: Started Kubernetes Kubelet.
Apr 22 18:46:28.723780 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.723762 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-208.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 18:46:28.724127 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.724110 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 18:46:28.724502 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.724484 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dqdcl"
Apr 22 18:46:28.727380 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.727366 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 18:46:28.730740 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.730719 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 18:46:28.730740 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.730731 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 18:46:28.731334 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.731279 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 18:46:28.731334 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.731301 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 18:46:28.731334 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.731320 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 18:46:28.731503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.731462 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 18:46:28.731503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.731470 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 18:46:28.732659 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.732499 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:28.733197 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733180 2562 factory.go:55] Registering systemd factory
Apr 22 18:46:28.733298 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733212 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 22 18:46:28.733298 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.733180 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 18:46:28.733466 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733445 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:28.733843 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733630 2562 factory.go:153] Registering CRI-O factory
Apr 22 18:46:28.733843 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733646 2562 factory.go:223] Registration of the crio container factory successfully
Apr 22 18:46:28.733843 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733709 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 18:46:28.733843 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733736 2562 factory.go:103] Registering Raw factory
Apr 22 18:46:28.733843 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.733750 2562 manager.go:1196] Started watching for new ooms in manager
Apr 22 18:46:28.734238 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.734210 2562 manager.go:319] Starting recovery of all containers
Apr 22 18:46:28.737742 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.737718 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-208.ec2.internal\" not found" node="ip-10-0-128-208.ec2.internal"
Apr 22 18:46:28.742715 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.742701 2562 manager.go:324] Recovery completed
Apr 22 18:46:28.747079 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.747065 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:28.749638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.749616 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:28.749729 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.749649 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:28.749729 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.749672 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:28.750474 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.750456 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 18:46:28.750474 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.750473 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 18:46:28.750613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.750494 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:46:28.753054 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.753037 2562 policy_none.go:49] "None policy: Start"
Apr 22 18:46:28.753054 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.753054 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 18:46:28.753179 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.753064 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 18:46:28.803567 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.803553 2562 manager.go:341] "Starting Device Plugin manager"
Apr 22 18:46:28.803700 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.803587 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 18:46:28.803700 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.803600 2562 server.go:85] "Starting device plugin registration server"
Apr 22 18:46:28.803849 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.803837 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 18:46:28.803904 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.803851 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 18:46:28.803952 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.803942 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 18:46:28.804087 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.804031 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 18:46:28.804087 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.804046 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 18:46:28.804775 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.804753 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 18:46:28.804865 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.804789 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:28.834778 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.834747 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 18:46:28.835964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.835942 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 18:46:28.836055 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.835968 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 18:46:28.836055 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.835983 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 18:46:28.836055 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.835989 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 18:46:28.836055 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.836038 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 18:46:28.840027 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.839992 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:28.904113 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.904068 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:28.905122 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.905108 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:28.905176 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.905136 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:28.905176 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.905147 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:28.905176 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.905167 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-208.ec2.internal"
Apr 22 18:46:28.914367 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.914354 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-208.ec2.internal"
Apr 22 18:46:28.914421 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.914372 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-208.ec2.internal\": node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:28.933631 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.933614 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:28.936689 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.936668 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"]
Apr 22 18:46:28.936779 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.936728 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:28.937463 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.937449 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:28.937530 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.937475 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:28.937530 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.937486 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:28.938727 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.938715 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:28.938896 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.938883 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:28.938935 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.938910 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:28.939366 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.939350 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:28.939456 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.939374 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:28.939456 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.939387 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:28.939456 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.939397 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:28.939456 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.939418 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:28.939456 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.939428 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:28.940568 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.940552 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:28.940650 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.940579 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:46:28.941230 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.941211 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:46:28.941304 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.941241 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:46:28.941304 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:28.941252 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:46:28.968626 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.968606 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-208.ec2.internal\" not found" node="ip-10-0-128-208.ec2.internal"
Apr 22 18:46:28.972972 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:28.972957 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-208.ec2.internal\" not found" node="ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.032763 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.032745 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4917e9ea21d72262d81705b2073ae76-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal\" (UID: \"a4917e9ea21d72262d81705b2073ae76\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.032867 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.032771 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4917e9ea21d72262d81705b2073ae76-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal\" (UID: \"a4917e9ea21d72262d81705b2073ae76\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.032867 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.032788 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e0c91486769e79b8cafe9f0bd44a6b4-config\") pod \"kube-apiserver-proxy-ip-10-0-128-208.ec2.internal\" (UID: \"2e0c91486769e79b8cafe9f0bd44a6b4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.033776 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.033761 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.133731 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.133704 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4917e9ea21d72262d81705b2073ae76-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal\" (UID: \"a4917e9ea21d72262d81705b2073ae76\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.133844 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.133734 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4917e9ea21d72262d81705b2073ae76-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal\" (UID: \"a4917e9ea21d72262d81705b2073ae76\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.133844 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.133752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e0c91486769e79b8cafe9f0bd44a6b4-config\") pod \"kube-apiserver-proxy-ip-10-0-128-208.ec2.internal\" (UID: \"2e0c91486769e79b8cafe9f0bd44a6b4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.133844 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.133788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e0c91486769e79b8cafe9f0bd44a6b4-config\") pod \"kube-apiserver-proxy-ip-10-0-128-208.ec2.internal\" (UID: \"2e0c91486769e79b8cafe9f0bd44a6b4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.133844 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.133807 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.133844 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.133806 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a4917e9ea21d72262d81705b2073ae76-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal\" (UID: \"a4917e9ea21d72262d81705b2073ae76\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.133844 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.133835 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a4917e9ea21d72262d81705b2073ae76-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal\" (UID: \"a4917e9ea21d72262d81705b2073ae76\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.234414 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.234360 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.270549 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.270527 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.275072 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.275058 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:29.334899 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.334868 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.435259 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.435238 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.535651 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.535596 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.635975 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.635951 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.641159 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.641141 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:46:29.641291 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.641276 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:46:29.641354 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.641327 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:46:29.728451 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.728416 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:28 +0000 UTC" deadline="2027-09-28 01:34:16.474993989 +0000 UTC"
Apr 22 18:46:29.728451 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.728446 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12558h47m46.746550986s"
Apr 22 18:46:29.729898 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:29.729869 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e0c91486769e79b8cafe9f0bd44a6b4.slice/crio-c3b5f4d7be2345ca8efc7881a72d80b05f157e0290ce4938fe007ee70bc5da15 WatchSource:0}: Error finding container c3b5f4d7be2345ca8efc7881a72d80b05f157e0290ce4938fe007ee70bc5da15: Status 404 returned error can't find the container with id c3b5f4d7be2345ca8efc7881a72d80b05f157e0290ce4938fe007ee70bc5da15
Apr 22 18:46:29.730434 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:29.730414 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4917e9ea21d72262d81705b2073ae76.slice/crio-ca5ed41fe686f5ab540de9b9175a97ba63b39adb07e2b31b4952b94ed41a99da WatchSource:0}: Error finding container ca5ed41fe686f5ab540de9b9175a97ba63b39adb07e2b31b4952b94ed41a99da: Status 404 returned error can't find the container with id ca5ed41fe686f5ab540de9b9175a97ba63b39adb07e2b31b4952b94ed41a99da
Apr 22 18:46:29.731484 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.731470 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:46:29.736091 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.736065 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.737618 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.737603 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:46:29.746684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.746667 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:29.769888 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.769868 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5p9nf"
Apr 22 18:46:29.778452 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.778428 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5p9nf"
Apr 22 18:46:29.837147 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.837091 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.838599 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.838558 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal" event={"ID":"2e0c91486769e79b8cafe9f0bd44a6b4","Type":"ContainerStarted","Data":"c3b5f4d7be2345ca8efc7881a72d80b05f157e0290ce4938fe007ee70bc5da15"}
Apr 22 18:46:29.839428 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.839408 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal" event={"ID":"a4917e9ea21d72262d81705b2073ae76","Type":"ContainerStarted","Data":"ca5ed41fe686f5ab540de9b9175a97ba63b39adb07e2b31b4952b94ed41a99da"}
Apr 22 18:46:29.937749 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:29.937722 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-208.ec2.internal\" not found"
Apr 22 18:46:29.979141 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:29.979120 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:30.031048 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.031027 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:30.044170 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.044153 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:46:30.045826 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.045812 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal"
Apr 22 18:46:30.055043 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.055024 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:46:30.245411 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.245119 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:30.497363 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.497288 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:30.594117 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.594085 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:30.711931 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.711897 2562 apiserver.go:52] "Watching apiserver"
Apr 22 18:46:30.718316 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.718287 2562 reflector.go:430] "Caches populated" type="*v1.Pod"
reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:30.719527 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.719503 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-9khsc","openshift-dns/node-resolver-sjgbr","openshift-image-registry/node-ca-qssrm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal","openshift-multus/multus-additional-cni-plugins-rwjzd","openshift-network-diagnostics/network-check-target-9g8sf","kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal","openshift-multus/multus-j7np9","openshift-multus/network-metrics-daemon-4vbxg","openshift-network-operator/iptables-alerter-49mmp","openshift-ovn-kubernetes/ovnkube-node-zf6t2","kube-system/konnectivity-agent-m7cgq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"] Apr 22 18:46:30.723245 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.723225 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjgbr" Apr 22 18:46:30.725693 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.725673 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.725805 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.725758 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:30.725972 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.725952 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.725972 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.725965 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sf2bc\"" Apr 22 18:46:30.726186 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.726160 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.727543 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.727527 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qssrm" Apr 22 18:46:30.727846 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.727827 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.728059 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.728046 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:30.728578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.728352 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:30.728578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.728359 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.728578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.728477 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fs825\"" 
Apr 22 18:46:30.728578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.728380 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fbpqc\"" Apr 22 18:46:30.728815 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.728582 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:30.729423 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.729407 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.730279 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.730254 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:30.730384 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.730281 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.730384 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.730261 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.730384 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.730362 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zxb2x\"" Apr 22 18:46:30.731942 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.731788 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rbl2b\"" Apr 22 18:46:30.731942 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.731840 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 
22 18:46:30.731942 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.731896 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.734123 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.734105 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.735869 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.735845 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:30.736281 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.736258 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.736369 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.736326 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:30.736512 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.736494 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:30.737215 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.736884 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:30.737215 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.737044 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6vt7\"" Apr 22 18:46:30.737215 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.737122 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:30.737215 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.737198 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.738940 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.737520 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.738940 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.738589 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:30.739111 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.739078 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-k7lhr\"" Apr 22 18:46:30.740529 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740505 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-modprobe-d\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.740613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740531 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-host\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.740613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740548 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-sys-fs\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.740613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740563 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0-konnectivity-ca\") pod \"konnectivity-agent-m7cgq\" (UID: \"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0\") " pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:30.740613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740586 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.740613 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:46:30.740610 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmtn9\" (UniqueName: \"kubernetes.io/projected/977c5b80-af35-48df-9e61-f66a82bc4f6b-kube-api-access-lmtn9\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740660 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysconfig\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740686 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysctl-conf\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740705 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-lib-modules\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740718 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-etc-selinux\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: 
\"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740737 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bctc\" (UniqueName: \"kubernetes.io/projected/c99d2e9b-5df0-44c6-9f90-b824537af676-kube-api-access-8bctc\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740760 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-socket-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740786 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-device-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740802 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjnh4\" (UniqueName: \"kubernetes.io/projected/01634ae5-68d0-4ab8-8b17-7736250c3f31-kube-api-access-fjnh4\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm" Apr 22 18:46:30.740873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740817 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysctl-d\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740875 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-systemd\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740907 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-run\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740934 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01634ae5-68d0-4ab8-8b17-7736250c3f31-serviceca\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740952 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c99d2e9b-5df0-44c6-9f90-b824537af676-tmp-dir\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:46:30.740968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.740986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-sys\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741040 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/977c5b80-af35-48df-9e61-f66a82bc4f6b-tmp\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741074 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-system-cni-dir\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rwjzd\" 
(UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741147 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stwgg\" (UniqueName: \"kubernetes.io/projected/7ff492a4-8166-428f-aa35-d7319e606032-kube-api-access-stwgg\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741180 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741210 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c99d2e9b-5df0-44c6-9f90-b824537af676-hosts-file\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741228 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-registration-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741255 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01634ae5-68d0-4ab8-8b17-7736250c3f31-host\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741280 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-kubernetes\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.741331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741297 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-var-lib-kubelet\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741327 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992rt\" (UniqueName: \"kubernetes.io/projected/977e4b6c-16dc-40ce-90ae-98209ae297a8-kube-api-access-992rt\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741377 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0-agent-certs\") pod \"konnectivity-agent-m7cgq\" (UID: \"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0\") " pod="kube-system/konnectivity-agent-m7cgq" 
Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741397 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-cnibin\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741418 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-os-release\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741433 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741454 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-tuned\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.741480 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-cni-binary-copy\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.742042 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:30.742121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.742106 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.742536 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.742122 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31"
Apr 22 18:46:30.744168 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.744150 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qgzpj\""
Apr 22 18:46:30.744242 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.744225 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:46:30.744588 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.744574 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:46:30.744639 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.744590 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:46:30.744834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.744820 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.747008 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.746991 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:46:30.747353 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.747328 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jtxgh\""
Apr 22 18:46:30.747476 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.747359 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:46:30.747476 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.747376 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:46:30.747476 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.747427 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:46:30.747639 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.747603 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:46:30.747639 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.747609 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:46:30.779606 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.779583 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:29 +0000 UTC" deadline="2027-12-11 16:38:14.599292271 +0000 UTC"
Apr 22 18:46:30.779606 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.779604 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14349h51m43.819690335s"
Apr 22 18:46:30.832466 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.832433 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:46:30.842391 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842363 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.842519 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842408 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysconfig\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.842519 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842435 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysctl-conf\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.842519 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842484 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-lib-modules\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842517 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-etc-selinux\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842548 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-cni-netd\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842562 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysconfig\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842575 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysctl-conf\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842590 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-sys\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842619 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stwgg\" (UniqueName: \"kubernetes.io/projected/7ff492a4-8166-428f-aa35-d7319e606032-kube-api-access-stwgg\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.842670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842660 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-socket-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842681 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842687 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-lib-modules\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842655 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-sys\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842725 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j6hz\" (UniqueName: \"kubernetes.io/projected/df350c2f-f320-4324-915f-7ba97845f4cf-kube-api-access-4j6hz\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842761 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32fb3eb3-1544-4a71-8e35-ca98066a2f14-cni-binary-copy\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842770 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-socket-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842809 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-hostroot\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842866 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-etc-selinux\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842906 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-netns\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842954 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6dv\" (UniqueName: \"kubernetes.io/projected/32fb3eb3-1544-4a71-8e35-ca98066a2f14-kube-api-access-nq6dv\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842974 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntkdb\" (UniqueName: \"kubernetes.io/projected/9bebf205-15a5-47cd-a9bd-2fe359cd3118-kube-api-access-ntkdb\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.842992 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c99d2e9b-5df0-44c6-9f90-b824537af676-hosts-file\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr"
Apr 22 18:46:30.843028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843030 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-kubelet\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843115 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c99d2e9b-5df0-44c6-9f90-b824537af676-hosts-file\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843136 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-run-netns\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843165 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843187 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-env-overrides\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843210 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-cni-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843232 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-cni-multus\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843255 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-etc-kubernetes\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843281 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01634ae5-68d0-4ab8-8b17-7736250c3f31-host\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843305 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0-agent-certs\") pod \"konnectivity-agent-m7cgq\" (UID: \"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0\") " pod="kube-system/konnectivity-agent-m7cgq"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843344 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/01634ae5-68d0-4ab8-8b17-7736250c3f31-host\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843322 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-os-release\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843392 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843402 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-os-release\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843418 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-cni-binary-copy\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843447 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g6nc\" (UniqueName: \"kubernetes.io/projected/da9bff9a-df34-4fcf-9338-631fbb086e31-kube-api-access-9g6nc\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843469 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-cni-bin\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.843647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843493 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-modprobe-d\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-sys-fs\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843606 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-modprobe-d\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843653 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843700 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-ovn\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843729 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-conf-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843700 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-sys-fs\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843754 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-slash\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843779 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-socket-dir-parent\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843784 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843810 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmtn9\" (UniqueName: \"kubernetes.io/projected/977c5b80-af35-48df-9e61-f66a82bc4f6b-kube-api-access-lmtn9\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843849 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-etc-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843855 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843880 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-ovnkube-config\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843989 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844007 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-cni-binary-copy\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.843994 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/977c5b80-af35-48df-9e61-f66a82bc4f6b-tmp\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.844638 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844102 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-system-cni-dir\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844130 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bctc\" (UniqueName: \"kubernetes.io/projected/c99d2e9b-5df0-44c6-9f90-b824537af676-kube-api-access-8bctc\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844152 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-device-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844163 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-system-cni-dir\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjnh4\" (UniqueName: \"kubernetes.io/projected/01634ae5-68d0-4ab8-8b17-7736250c3f31-kube-api-access-fjnh4\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844211 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-device-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844220 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysctl-d\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844245 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-systemd\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-run\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844294 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01634ae5-68d0-4ab8-8b17-7736250c3f31-serviceca\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844336 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c99d2e9b-5df0-44c6-9f90-b824537af676-tmp-dir\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844344 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-systemd\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844366 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-sysctl-d\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844385 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-run\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844417 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844449 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-k8s-cni-cncf-io\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844476 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:46:30.845582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844502 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-registration-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844546 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-ovnkube-script-lib\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844557 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/977e4b6c-16dc-40ce-90ae-98209ae297a8-registration-dir\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844583 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-cnibin\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844613 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-kubernetes\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844629 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c99d2e9b-5df0-44c6-9f90-b824537af676-tmp-dir\") pod \"node-resolver-sjgbr\" (UID: \"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844642 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-var-lib-kubelet\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844669 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-992rt\" (UniqueName: \"kubernetes.io/projected/977e4b6c-16dc-40ce-90ae-98209ae297a8-kube-api-access-992rt\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844694 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-cnibin\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844718 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-node-log\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844744 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-cni-bin\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844750 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/01634ae5-68d0-4ab8-8b17-7736250c3f31-serviceca\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844719 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-kubernetes\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844769 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ff492a4-8166-428f-aa35-d7319e606032-cnibin\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844769 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-os-release\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9"
Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:46:30.844812 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-tuned\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844830 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-var-lib-kubelet\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.846399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844867 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-systemd-units\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844883 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7ff492a4-8166-428f-aa35-d7319e606032-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844921 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-systemd\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844941 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-var-lib-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844955 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df350c2f-f320-4324-915f-7ba97845f4cf-ovn-node-metrics-cert\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.844993 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-daemon-config\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845044 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-multus-certs\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845074 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-host\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845101 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0-konnectivity-ca\") pod \"konnectivity-agent-m7cgq\" (UID: \"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0\") " pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845128 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9bebf205-15a5-47cd-a9bd-2fe359cd3118-iptables-alerter-script\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845143 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/977c5b80-af35-48df-9e61-f66a82bc4f6b-host\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845152 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845174 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-log-socket\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845188 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-system-cni-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845210 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-kubelet\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845231 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bebf205-15a5-47cd-a9bd-2fe359cd3118-host-slash\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.847188 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.845613 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0-konnectivity-ca\") pod \"konnectivity-agent-m7cgq\" (UID: \"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0\") " pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:30.847809 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:46:30.847193 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/977c5b80-af35-48df-9e61-f66a82bc4f6b-etc-tuned\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.847809 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.847269 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/977c5b80-af35-48df-9e61-f66a82bc4f6b-tmp\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.847809 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.847705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0-agent-certs\") pod \"konnectivity-agent-m7cgq\" (UID: \"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0\") " pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:30.849628 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.849599 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:30.849731 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.849635 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:30.849731 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.849652 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 
18:46:30.849731 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.849727 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:31.349691012 +0000 UTC m=+3.028917293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:30.851622 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.851544 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmtn9\" (UniqueName: \"kubernetes.io/projected/977c5b80-af35-48df-9e61-f66a82bc4f6b-kube-api-access-lmtn9\") pod \"tuned-9khsc\" (UID: \"977c5b80-af35-48df-9e61-f66a82bc4f6b\") " pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:30.852132 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.851943 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stwgg\" (UniqueName: \"kubernetes.io/projected/7ff492a4-8166-428f-aa35-d7319e606032-kube-api-access-stwgg\") pod \"multus-additional-cni-plugins-rwjzd\" (UID: \"7ff492a4-8166-428f-aa35-d7319e606032\") " pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:30.852132 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.851942 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bctc\" (UniqueName: \"kubernetes.io/projected/c99d2e9b-5df0-44c6-9f90-b824537af676-kube-api-access-8bctc\") pod \"node-resolver-sjgbr\" (UID: 
\"c99d2e9b-5df0-44c6-9f90-b824537af676\") " pod="openshift-dns/node-resolver-sjgbr" Apr 22 18:46:30.852132 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.852091 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-992rt\" (UniqueName: \"kubernetes.io/projected/977e4b6c-16dc-40ce-90ae-98209ae297a8-kube-api-access-992rt\") pod \"aws-ebs-csi-driver-node-2bqlg\" (UID: \"977e4b6c-16dc-40ce-90ae-98209ae297a8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:30.852576 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.852556 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjnh4\" (UniqueName: \"kubernetes.io/projected/01634ae5-68d0-4ab8-8b17-7736250c3f31-kube-api-access-fjnh4\") pod \"node-ca-qssrm\" (UID: \"01634ae5-68d0-4ab8-8b17-7736250c3f31\") " pod="openshift-image-registry/node-ca-qssrm" Apr 22 18:46:30.946530 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946500 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-node-log\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946543 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-cni-bin\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946579 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-os-release\") pod \"multus-j7np9\" 
(UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946605 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-systemd-units\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946606 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-node-log\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946646 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-systemd\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946671 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-var-lib-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.946705 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946698 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df350c2f-f320-4324-915f-7ba97845f4cf-ovn-node-metrics-cert\") pod \"ovnkube-node-zf6t2\" (UID: 
\"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946714 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-systemd\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946722 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-daemon-config\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946757 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-os-release\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946776 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-multus-certs\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946802 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-systemd-units\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946809 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9bebf205-15a5-47cd-a9bd-2fe359cd3118-iptables-alerter-script\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946836 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946842 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-multus-certs\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946870 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-log-socket\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946898 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-system-cni-dir\") pod \"multus-j7np9\" (UID: 
\"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946923 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-kubelet\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.946935 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946948 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bebf205-15a5-47cd-a9bd-2fe359cd3118-host-slash\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946972 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:30.947004 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:31.44698137 +0000 UTC m=+3.126207660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:30.947057 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947007 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947073 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-log-socket\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947075 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-system-cni-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947103 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-kubelet\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947139 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/9bebf205-15a5-47cd-a9bd-2fe359cd3118-host-slash\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947176 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-cni-netd\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947223 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-cni-netd\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947263 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.946670 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-cni-bin\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947311 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-daemon-config\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947338 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947354 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9bebf205-15a5-47cd-a9bd-2fe359cd3118-iptables-alerter-script\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947300 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4j6hz\" (UniqueName: \"kubernetes.io/projected/df350c2f-f320-4324-915f-7ba97845f4cf-kube-api-access-4j6hz\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947394 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-var-lib-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.947804 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947405 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32fb3eb3-1544-4a71-8e35-ca98066a2f14-cni-binary-copy\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947432 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-hostroot\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947482 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-netns\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.947804 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6dv\" (UniqueName: \"kubernetes.io/projected/32fb3eb3-1544-4a71-8e35-ca98066a2f14-kube-api-access-nq6dv\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947533 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntkdb\" (UniqueName: \"kubernetes.io/projected/9bebf205-15a5-47cd-a9bd-2fe359cd3118-kube-api-access-ntkdb\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:46:30.947548 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-hostroot\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947556 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-kubelet\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947579 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-run-netns\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947600 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-env-overrides\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:46:30.947659 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-cni-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947685 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-cni-multus\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947709 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-etc-kubernetes\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947749 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-run-netns\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g6nc\" (UniqueName: \"kubernetes.io/projected/da9bff9a-df34-4fcf-9338-631fbb086e31-kube-api-access-9g6nc\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:46:30.947789 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-cni-bin\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947796 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-netns\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947820 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-ovn\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947834 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-cni-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947846 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-conf-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947876 2562 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-slash\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.948577 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947905 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-socket-dir-parent\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947932 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-etc-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947956 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-ovnkube-config\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.947991 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-k8s-cni-cncf-io\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948047 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-ovnkube-script-lib\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948070 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-cnibin\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32fb3eb3-1544-4a71-8e35-ca98066a2f14-cni-binary-copy\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948150 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-cnibin\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948197 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-kubelet\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948204 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-cni-multus\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948231 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-host-slash\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948266 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-etc-kubernetes\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948283 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-socket-dir-parent\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.949417 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948466 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-var-lib-cni-bin\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948502 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-run-ovn\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948535 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-multus-conf-dir\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948566 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/df350c2f-f320-4324-915f-7ba97845f4cf-etc-openvswitch\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948602 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32fb3eb3-1544-4a71-8e35-ca98066a2f14-host-run-k8s-cni-cncf-io\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948718 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-ovnkube-config\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.948826 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-env-overrides\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.950052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.949086 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/df350c2f-f320-4324-915f-7ba97845f4cf-ovnkube-script-lib\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.950909 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.950889 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/df350c2f-f320-4324-915f-7ba97845f4cf-ovn-node-metrics-cert\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.957459 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.957438 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6dv\" (UniqueName: \"kubernetes.io/projected/32fb3eb3-1544-4a71-8e35-ca98066a2f14-kube-api-access-nq6dv\") pod \"multus-j7np9\" (UID: \"32fb3eb3-1544-4a71-8e35-ca98066a2f14\") " pod="openshift-multus/multus-j7np9" Apr 22 18:46:30.958248 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.958228 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j6hz\" (UniqueName: 
\"kubernetes.io/projected/df350c2f-f320-4324-915f-7ba97845f4cf-kube-api-access-4j6hz\") pod \"ovnkube-node-zf6t2\" (UID: \"df350c2f-f320-4324-915f-7ba97845f4cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:30.959200 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.959182 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntkdb\" (UniqueName: \"kubernetes.io/projected/9bebf205-15a5-47cd-a9bd-2fe359cd3118-kube-api-access-ntkdb\") pod \"iptables-alerter-49mmp\" (UID: \"9bebf205-15a5-47cd-a9bd-2fe359cd3118\") " pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:30.959455 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:30.959436 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g6nc\" (UniqueName: \"kubernetes.io/projected/da9bff9a-df34-4fcf-9338-631fbb086e31-kube-api-access-9g6nc\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:31.035328 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.035256 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sjgbr" Apr 22 18:46:31.044066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.044039 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" Apr 22 18:46:31.051815 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.051797 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:31.056075 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.056060 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qssrm" Apr 22 18:46:31.061570 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.061551 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9khsc" Apr 22 18:46:31.067109 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.067084 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" Apr 22 18:46:31.074632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.074611 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j7np9" Apr 22 18:46:31.081149 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.081132 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-49mmp" Apr 22 18:46:31.085483 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.085461 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:31.312397 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.312370 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977e4b6c_16dc_40ce_90ae_98209ae297a8.slice/crio-46c1653f9d54e2762138d3108ef91ec8f377e8cc0338c8d499a25fd5c33341f7 WatchSource:0}: Error finding container 46c1653f9d54e2762138d3108ef91ec8f377e8cc0338c8d499a25fd5c33341f7: Status 404 returned error can't find the container with id 46c1653f9d54e2762138d3108ef91ec8f377e8cc0338c8d499a25fd5c33341f7 Apr 22 18:46:31.314000 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.313977 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod977c5b80_af35_48df_9e61_f66a82bc4f6b.slice/crio-f31172654db4258c61a81753708f0f3d410d5f8efe6e1e1f14125dfec314a2f8 WatchSource:0}: Error finding container f31172654db4258c61a81753708f0f3d410d5f8efe6e1e1f14125dfec314a2f8: Status 404 returned error can't find the container with id f31172654db4258c61a81753708f0f3d410d5f8efe6e1e1f14125dfec314a2f8 Apr 22 18:46:31.315877 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.315739 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bebf205_15a5_47cd_a9bd_2fe359cd3118.slice/crio-9859c1ad81c782142d9a808c8d6c9ee68403d7a128c7c0241ec277e62a7673ff WatchSource:0}: Error finding container 9859c1ad81c782142d9a808c8d6c9ee68403d7a128c7c0241ec277e62a7673ff: Status 404 returned error can't find the container with id 9859c1ad81c782142d9a808c8d6c9ee68403d7a128c7c0241ec277e62a7673ff Apr 22 18:46:31.316729 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.316705 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01634ae5_68d0_4ab8_8b17_7736250c3f31.slice/crio-c46640a78cef8a2bb9e84df3a624205352f4d2fac2a1d6342bbd8e717d72f0e6 WatchSource:0}: Error finding container c46640a78cef8a2bb9e84df3a624205352f4d2fac2a1d6342bbd8e717d72f0e6: Status 404 returned error can't find the container with id c46640a78cef8a2bb9e84df3a624205352f4d2fac2a1d6342bbd8e717d72f0e6 Apr 22 18:46:31.318483 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.318309 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32fb3eb3_1544_4a71_8e35_ca98066a2f14.slice/crio-3ef3250e23be4df58cb263937706cfbd0cd27cfd934337a393d164e66d32dbb5 WatchSource:0}: Error finding container 3ef3250e23be4df58cb263937706cfbd0cd27cfd934337a393d164e66d32dbb5: Status 404 returned error can't find the container with id 3ef3250e23be4df58cb263937706cfbd0cd27cfd934337a393d164e66d32dbb5 Apr 22 18:46:31.321880 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.321854 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff492a4_8166_428f_aa35_d7319e606032.slice/crio-96a499ead7a529714e93151fdc013be6095c1a05aa67c1a7ecd89ac9fa60f648 WatchSource:0}: Error finding container 96a499ead7a529714e93151fdc013be6095c1a05aa67c1a7ecd89ac9fa60f648: Status 404 returned error can't find the container with id 96a499ead7a529714e93151fdc013be6095c1a05aa67c1a7ecd89ac9fa60f648 Apr 22 18:46:31.324166 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.324090 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd73bb8_d0cb_40bb_828c_febdb1ce4ac0.slice/crio-fd50a930870bf9e8b1ff77fc3640c0c156c8aa1ab75ed05d3a1968eebff74946 WatchSource:0}: Error finding container fd50a930870bf9e8b1ff77fc3640c0c156c8aa1ab75ed05d3a1968eebff74946: Status 404 returned error can't find 
the container with id fd50a930870bf9e8b1ff77fc3640c0c156c8aa1ab75ed05d3a1968eebff74946 Apr 22 18:46:31.325268 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.324570 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf350c2f_f320_4324_915f_7ba97845f4cf.slice/crio-2d0132676b23270e67867203984416eb6204bb2ca29d8d5f02b4e7d5655b2334 WatchSource:0}: Error finding container 2d0132676b23270e67867203984416eb6204bb2ca29d8d5f02b4e7d5655b2334: Status 404 returned error can't find the container with id 2d0132676b23270e67867203984416eb6204bb2ca29d8d5f02b4e7d5655b2334 Apr 22 18:46:31.325268 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:46:31.325119 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc99d2e9b_5df0_44c6_9f90_b824537af676.slice/crio-f4f2c3aaf0c2abc8e42f25f665295f4c2d1c4283ea25f79fb43444cebf42df5d WatchSource:0}: Error finding container f4f2c3aaf0c2abc8e42f25f665295f4c2d1c4283ea25f79fb43444cebf42df5d: Status 404 returned error can't find the container with id f4f2c3aaf0c2abc8e42f25f665295f4c2d1c4283ea25f79fb43444cebf42df5d Apr 22 18:46:31.350823 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.350802 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:31.350918 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:31.350907 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:31.350965 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:31.350922 2562 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:31.350965 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:31.350931 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:31.351057 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:31.350985 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:32.350965181 +0000 UTC m=+4.030191464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:31.451481 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.451457 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:31.451573 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:31.451555 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 
22 18:46:31.451617 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:31.451597 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:32.451583215 +0000 UTC m=+4.130809485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:31.780715 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.780050 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:29 +0000 UTC" deadline="2027-11-13 04:21:09.385408254 +0000 UTC" Apr 22 18:46:31.780715 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.780150 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13665h34m37.605262842s" Apr 22 18:46:31.850365 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.850331 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-49mmp" event={"ID":"9bebf205-15a5-47cd-a9bd-2fe359cd3118","Type":"ContainerStarted","Data":"9859c1ad81c782142d9a808c8d6c9ee68403d7a128c7c0241ec277e62a7673ff"} Apr 22 18:46:31.862612 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.862585 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j7np9" event={"ID":"32fb3eb3-1544-4a71-8e35-ca98066a2f14","Type":"ContainerStarted","Data":"3ef3250e23be4df58cb263937706cfbd0cd27cfd934337a393d164e66d32dbb5"} Apr 22 18:46:31.869154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.869118 2562 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns/node-resolver-sjgbr" event={"ID":"c99d2e9b-5df0-44c6-9f90-b824537af676","Type":"ContainerStarted","Data":"f4f2c3aaf0c2abc8e42f25f665295f4c2d1c4283ea25f79fb43444cebf42df5d"} Apr 22 18:46:31.871531 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.871506 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m7cgq" event={"ID":"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0","Type":"ContainerStarted","Data":"fd50a930870bf9e8b1ff77fc3640c0c156c8aa1ab75ed05d3a1968eebff74946"} Apr 22 18:46:31.874794 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.874761 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qssrm" event={"ID":"01634ae5-68d0-4ab8-8b17-7736250c3f31","Type":"ContainerStarted","Data":"c46640a78cef8a2bb9e84df3a624205352f4d2fac2a1d6342bbd8e717d72f0e6"} Apr 22 18:46:31.876338 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.876282 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9khsc" event={"ID":"977c5b80-af35-48df-9e61-f66a82bc4f6b","Type":"ContainerStarted","Data":"f31172654db4258c61a81753708f0f3d410d5f8efe6e1e1f14125dfec314a2f8"} Apr 22 18:46:31.886053 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.885989 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" event={"ID":"977e4b6c-16dc-40ce-90ae-98209ae297a8","Type":"ContainerStarted","Data":"46c1653f9d54e2762138d3108ef91ec8f377e8cc0338c8d499a25fd5c33341f7"} Apr 22 18:46:31.891046 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.890574 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal" event={"ID":"2e0c91486769e79b8cafe9f0bd44a6b4","Type":"ContainerStarted","Data":"0d4e8d44fa0f82773a8838ad5ef4e25385cd23c1adbbdcd527372a3f5db38d39"} Apr 22 18:46:31.902754 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.902706 
2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"2d0132676b23270e67867203984416eb6204bb2ca29d8d5f02b4e7d5655b2334"} Apr 22 18:46:31.906131 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.905790 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-208.ec2.internal" podStartSLOduration=1.905775723 podStartE2EDuration="1.905775723s" podCreationTimestamp="2026-04-22 18:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:31.904675961 +0000 UTC m=+3.583902252" watchObservedRunningTime="2026-04-22 18:46:31.905775723 +0000 UTC m=+3.585002017" Apr 22 18:46:31.908951 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:31.908295 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerStarted","Data":"96a499ead7a529714e93151fdc013be6095c1a05aa67c1a7ecd89ac9fa60f648"} Apr 22 18:46:32.359073 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:32.358971 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:32.359248 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.359201 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:32.359248 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.359222 2562 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:32.359248 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.359235 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:32.359403 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.359294 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:34.359275482 +0000 UTC m=+6.038501762 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:32.460177 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:32.460138 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:32.460362 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.460281 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:32.460430 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.460368 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:34.460324686 +0000 UTC m=+6.139550961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:32.838101 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:32.837004 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:46:32.838101 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:32.837102 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:32.838101 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.837267 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31"
Apr 22 18:46:32.838101 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:32.837696 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:32.930427 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:32.930390 2562 generic.go:358] "Generic (PLEG): container finished" podID="a4917e9ea21d72262d81705b2073ae76" containerID="ddcd29f4f50f47a3ca218296773280da35e98353c177b6c409afcaccd0cfe277" exitCode=0 Apr 22 18:46:32.931353 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:32.931278 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal" event={"ID":"a4917e9ea21d72262d81705b2073ae76","Type":"ContainerDied","Data":"ddcd29f4f50f47a3ca218296773280da35e98353c177b6c409afcaccd0cfe277"} Apr 22 18:46:33.955047 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:33.954454 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal" event={"ID":"a4917e9ea21d72262d81705b2073ae76","Type":"ContainerStarted","Data":"e434f7320019ada495274a8ca24ae588f405e20531411326f09012438b568df9"} Apr 22 18:46:34.376631 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:34.376539 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:34.376801 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.376713 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:34.376801 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.376735 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:34.376801 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.376747 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:34.376948 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.376803 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.376784869 +0000 UTC m=+10.056011154 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:34.477101 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:34.477060 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:34.477290 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.477190 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:34.477290 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.477284 
2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.47726212 +0000 UTC m=+10.156488402 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:34.837756 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:34.837722 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:46:34.837978 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.837857 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94"
Apr 22 18:46:34.838395 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:34.838373 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:34.838507 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:34.838483 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31"
Apr 22 18:46:36.837040 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:36.836818 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:46:36.837040 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:36.836950 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94"
Apr 22 18:46:36.837543 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:36.837067 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:36.837543 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:36.837208 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:38.411643 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:38.411602 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:38.412126 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.411796 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:38.412126 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.411826 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:38.412126 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.411840 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:38.412126 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.411903 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.411884057 +0000 UTC m=+18.091110331 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:38.512528 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:38.512482 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:38.512688 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.512668 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:38.512763 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.512751 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.512730417 +0000 UTC m=+18.191956698 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:38.837103 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:38.837068 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:38.837289 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.837180 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:38.837366 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:38.837322 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:38.837450 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:38.837423 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:40.836598 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:40.836561 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:40.836598 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:40.836590 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:40.837094 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:40.836712 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:40.837094 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:40.836857 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:42.839578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:42.839547 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:42.839578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:42.839579 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:42.840188 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:42.839679 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:42.840188 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:42.839821 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:44.839394 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:44.839367 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:44.839824 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:44.839367 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:44.839824 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:44.839479 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:44.839824 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:44.839570 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:46.476311 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:46.476224 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:46.476741 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.476409 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:46.476741 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.476431 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:46.476741 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.476441 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:46.476741 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.476496 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.476480248 +0000 UTC m=+34.155706523 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:46.576557 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:46.576528 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:46.576718 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.576692 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.576778 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.576767 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.576746049 +0000 UTC m=+34.255972338 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.836487 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:46.836411 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:46:46.836647 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.836532 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94"
Apr 22 18:46:46.836647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:46.836620 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:46.836764 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:46.836733 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31"
Apr 22 18:46:48.842048 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.840309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:46:48.842048 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:48.840590 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:48.842048 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.840681 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:48.842048 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:48.840752 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:48.980899 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.980677 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sjgbr" event={"ID":"c99d2e9b-5df0-44c6-9f90-b824537af676","Type":"ContainerStarted","Data":"669dc633d36dcd89417414127fdb743665ca01670a513c75e21083757eb39015"} Apr 22 18:46:48.982721 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.982690 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-m7cgq" event={"ID":"9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0","Type":"ContainerStarted","Data":"4114e9249af2ee3c7d64430517b37a58babf8d338840a0513ac75de62cd5834a"} Apr 22 18:46:48.985559 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.985240 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qssrm" event={"ID":"01634ae5-68d0-4ab8-8b17-7736250c3f31","Type":"ContainerStarted","Data":"a0a461946b580a503f6042f4021336fbd5f8aa57bcb7a3fbfc806d299f66cad6"} Apr 22 18:46:48.987194 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.987034 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9khsc" 
event={"ID":"977c5b80-af35-48df-9e61-f66a82bc4f6b","Type":"ContainerStarted","Data":"9ef3591aa727de9f10f55ddb6286c2415ade57c6d2b88e8c6ce176491cc79592"} Apr 22 18:46:48.996134 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.996108 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j7np9" event={"ID":"32fb3eb3-1544-4a71-8e35-ca98066a2f14","Type":"ContainerStarted","Data":"44cede3f90bb98767120070364c424f677dfba4159de82a39f7be4261a228d98"} Apr 22 18:46:48.997770 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.997722 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sjgbr" podStartSLOduration=3.692204759 podStartE2EDuration="20.997706046s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.327891218 +0000 UTC m=+3.007117488" lastFinishedPulling="2026-04-22 18:46:48.633392496 +0000 UTC m=+20.312618775" observedRunningTime="2026-04-22 18:46:48.997435042 +0000 UTC m=+20.676661360" watchObservedRunningTime="2026-04-22 18:46:48.997706046 +0000 UTC m=+20.676932338" Apr 22 18:46:48.997991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:48.997962 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-208.ec2.internal" podStartSLOduration=18.997953534 podStartE2EDuration="18.997953534s" podCreationTimestamp="2026-04-22 18:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:33.969305238 +0000 UTC m=+5.648531553" watchObservedRunningTime="2026-04-22 18:46:48.997953534 +0000 UTC m=+20.677179827" Apr 22 18:46:49.014317 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:49.014265 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9khsc" podStartSLOduration=2.675680581 
podStartE2EDuration="20.014247583s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.315530479 +0000 UTC m=+2.994756749" lastFinishedPulling="2026-04-22 18:46:48.654097468 +0000 UTC m=+20.333323751" observedRunningTime="2026-04-22 18:46:49.013323757 +0000 UTC m=+20.692550051" watchObservedRunningTime="2026-04-22 18:46:49.014247583 +0000 UTC m=+20.693473875" Apr 22 18:46:49.058857 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:49.058807 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-m7cgq" podStartSLOduration=11.159312998 podStartE2EDuration="20.058792079s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.325806766 +0000 UTC m=+3.005033051" lastFinishedPulling="2026-04-22 18:46:40.225285862 +0000 UTC m=+11.904512132" observedRunningTime="2026-04-22 18:46:49.045580878 +0000 UTC m=+20.724807450" watchObservedRunningTime="2026-04-22 18:46:49.058792079 +0000 UTC m=+20.738018420" Apr 22 18:46:49.059162 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:49.059119 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qssrm" podStartSLOduration=2.724871503 podStartE2EDuration="20.059107291s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.319097591 +0000 UTC m=+2.998323867" lastFinishedPulling="2026-04-22 18:46:48.653333385 +0000 UTC m=+20.332559655" observedRunningTime="2026-04-22 18:46:49.058999746 +0000 UTC m=+20.738226040" watchObservedRunningTime="2026-04-22 18:46:49.059107291 +0000 UTC m=+20.738333585" Apr 22 18:46:50.001099 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.000864 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" 
event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"3471bc041c0101b31ee34d44a165cda2203cde4db1a1a47a9d77ed3f3ee50f65"} Apr 22 18:46:50.001561 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.001105 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"882069a5e517fe838e3c516d5436df7eb923ba989ef538bec953c4c5f1d37065"} Apr 22 18:46:50.001561 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.001122 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"76247b37837ebba93beb9990d4c144c77075bb14e29875fde734359521e6eed2"} Apr 22 18:46:50.001561 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.001135 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"d11d4dd74d7f9d55a89b4ecbe91a2106d9edf00fca6adde953f575a0ee7db8a8"} Apr 22 18:46:50.001561 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.001149 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"bfd0348e6a27a5c2c9402893e943673c686d57bad1e20e11a98da7ee51821f5c"} Apr 22 18:46:50.001561 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.001160 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"89dd8ab83daa2876261aaf29af764f18f79cda760d7549c8cdf34383b06139e2"} Apr 22 18:46:50.002397 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.002370 2562 generic.go:358] "Generic (PLEG): container finished" 
podID="7ff492a4-8166-428f-aa35-d7319e606032" containerID="5efbf0658ff82a0415411777e52b24fbe834f17558fd95d6f102af7cdf72401d" exitCode=0 Apr 22 18:46:50.002498 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.002425 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerDied","Data":"5efbf0658ff82a0415411777e52b24fbe834f17558fd95d6f102af7cdf72401d"} Apr 22 18:46:50.005345 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.005317 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-49mmp" event={"ID":"9bebf205-15a5-47cd-a9bd-2fe359cd3118","Type":"ContainerStarted","Data":"4bc29dc72b84131242db88d197530cec43659aad9aac69f75ddc1f96eb5e5d7c"} Apr 22 18:46:50.007102 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.007080 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" event={"ID":"977e4b6c-16dc-40ce-90ae-98209ae297a8","Type":"ContainerStarted","Data":"eea357eb3d7b46af70388bea4a7a8314156ebaec2289da44c4d3e1bf220d8890"} Apr 22 18:46:50.024869 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.024833 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j7np9" podStartSLOduration=3.679817297 podStartE2EDuration="21.024823549s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.320748751 +0000 UTC m=+2.999975024" lastFinishedPulling="2026-04-22 18:46:48.665754995 +0000 UTC m=+20.344981276" observedRunningTime="2026-04-22 18:46:49.075162714 +0000 UTC m=+20.754389005" watchObservedRunningTime="2026-04-22 18:46:50.024823549 +0000 UTC m=+21.704049838" Apr 22 18:46:50.038627 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.038576 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-49mmp" podStartSLOduration=4.109851774 podStartE2EDuration="21.038561196s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.318703684 +0000 UTC m=+2.997929970" lastFinishedPulling="2026-04-22 18:46:48.247413104 +0000 UTC m=+19.926639392" observedRunningTime="2026-04-22 18:46:50.038091684 +0000 UTC m=+21.717317980" watchObservedRunningTime="2026-04-22 18:46:50.038561196 +0000 UTC m=+21.717787490" Apr 22 18:46:50.242781 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.242757 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:46:50.817352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.817247 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:50.242775394Z","UUID":"5b40d8f3-f279-4115-8acc-9904dbe24b5a","Handler":null,"Name":"","Endpoint":""} Apr 22 18:46:50.819379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.819356 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:46:50.819379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.819384 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:46:50.836810 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.836783 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:50.836944 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:50.836898 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:50.837006 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:50.836784 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:50.837191 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:50.837131 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:51.010622 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:51.010578 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" event={"ID":"977e4b6c-16dc-40ce-90ae-98209ae297a8","Type":"ContainerStarted","Data":"c34e23c9e4e0d0715e5c80d027e7e3429f0de42ac368724da5774993149fa7c9"} Apr 22 18:46:52.015685 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.015648 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"54880911fdb34330437cdfae291bec2a3b2c28ce504aeaad4fb997e6e850407f"} Apr 22 18:46:52.017942 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.017708 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" event={"ID":"977e4b6c-16dc-40ce-90ae-98209ae297a8","Type":"ContainerStarted","Data":"d04968da9c466fd385081af92d593fad4aeee039e9f0022250552cf2b13c679c"} Apr 22 18:46:52.035537 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.035491 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2bqlg" podStartSLOduration=4.285925971 podStartE2EDuration="24.035478089s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.314135622 +0000 UTC m=+2.993361897" lastFinishedPulling="2026-04-22 18:46:51.063687727 +0000 UTC m=+22.742914015" observedRunningTime="2026-04-22 18:46:52.033816211 +0000 UTC m=+23.713042502" watchObservedRunningTime="2026-04-22 18:46:52.035478089 +0000 UTC m=+23.714704380" Apr 22 18:46:52.637217 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.637187 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:52.637784 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.637765 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:52.836766 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.836730 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:52.836766 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:52.836756 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:52.836987 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:52.836848 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:52.837069 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:52.837003 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:53.019650 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:53.019618 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:53.020248 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:53.020095 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-m7cgq" Apr 22 18:46:54.837190 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:54.836962 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:54.837554 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:54.836964 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:54.837554 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:54.837224 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:54.837554 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:54.837348 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:55.024751 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:55.024721 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" event={"ID":"df350c2f-f320-4324-915f-7ba97845f4cf","Type":"ContainerStarted","Data":"3d9961c7b2c861e95439b1ed3cd48b0455ef64f590607616ce36f577937a55ea"} Apr 22 18:46:55.025047 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:55.025005 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:55.026409 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:55.026386 2562 generic.go:358] "Generic (PLEG): container finished" podID="7ff492a4-8166-428f-aa35-d7319e606032" containerID="ba0d6e0b5ceb8a8945f98e13b52d36f442ea7e74aad4e8d11d9caeec8b3a65e4" exitCode=0 Apr 22 18:46:55.026492 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:55.026445 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerDied","Data":"ba0d6e0b5ceb8a8945f98e13b52d36f442ea7e74aad4e8d11d9caeec8b3a65e4"} Apr 22 18:46:55.039522 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:55.039502 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:55.052180 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:55.052141 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" podStartSLOduration=8.490355142 podStartE2EDuration="26.052129884s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.326473529 +0000 UTC m=+3.005699800" lastFinishedPulling="2026-04-22 18:46:48.888248253 +0000 UTC m=+20.567474542" observedRunningTime="2026-04-22 
18:46:55.05166949 +0000 UTC m=+26.730895781" watchObservedRunningTime="2026-04-22 18:46:55.052129884 +0000 UTC m=+26.731356175" Apr 22 18:46:56.029937 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.029910 2562 generic.go:358] "Generic (PLEG): container finished" podID="7ff492a4-8166-428f-aa35-d7319e606032" containerID="2058d6e34674f7555b9f2fd552b1002b438fbd7c595a32f900ab3b7833c740b5" exitCode=0 Apr 22 18:46:56.030297 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.029984 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerDied","Data":"2058d6e34674f7555b9f2fd552b1002b438fbd7c595a32f900ab3b7833c740b5"} Apr 22 18:46:56.030297 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.030241 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:56.030690 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.030671 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:56.044614 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.044592 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2" Apr 22 18:46:56.836995 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.836973 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:56.837087 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:56.836979 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:56.837185 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:56.837084 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:56.837237 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:56.837217 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:46:57.032847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:57.032817 2562 generic.go:358] "Generic (PLEG): container finished" podID="7ff492a4-8166-428f-aa35-d7319e606032" containerID="658bc03ba20910b4cad4a4f84da4e41a3de280ad316cd2faab88fb39da0921a3" exitCode=0 Apr 22 18:46:57.033227 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:57.032885 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerDied","Data":"658bc03ba20910b4cad4a4f84da4e41a3de280ad316cd2faab88fb39da0921a3"} Apr 22 18:46:58.839157 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:58.838504 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:46:58.839157 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:58.838635 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:46:58.839690 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:46:58.839217 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:46:58.839690 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:46:58.839475 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:47:00.643786 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:00.643477 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9g8sf"] Apr 22 18:47:00.644324 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:00.643823 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:47:00.644324 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:00.643941 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:47:00.658379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:00.658214 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vbxg"] Apr 22 18:47:00.658379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:00.658323 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:47:00.658769 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:00.658664 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:47:01.836907 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:01.836873 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:47:01.836907 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:01.836883 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:47:01.837548 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:01.836974 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:47:01.837548 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:01.837130 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:47:02.500754 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:02.500710 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:47:02.500987 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:02.500884 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:47:02.500987 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:02.500906 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Apr 22 18:47:02.500987 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:02.500916 2562 projected.go:194] Error preparing data for projected volume kube-api-access-47h8s for pod openshift-network-diagnostics/network-check-target-9g8sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:02.500987 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:02.500979 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s podName:0b614b94-c0eb-4f39-add0-c46922389f94 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.500961634 +0000 UTC m=+66.180187903 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-47h8s" (UniqueName: "kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s") pod "network-check-target-9g8sf" (UID: "0b614b94-c0eb-4f39-add0-c46922389f94") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:02.601966 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:02.601880 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:47:02.602146 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:02.602030 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:02.602146 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:02.602123 2562 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs podName:da9bff9a-df34-4fcf-9338-631fbb086e31 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.602102467 +0000 UTC m=+66.281328738 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs") pod "network-metrics-daemon-4vbxg" (UID: "da9bff9a-df34-4fcf-9338-631fbb086e31") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:03.836154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:03.836123 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf" Apr 22 18:47:03.836686 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:03.836123 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg" Apr 22 18:47:03.836686 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:03.836227 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9g8sf" podUID="0b614b94-c0eb-4f39-add0-c46922389f94" Apr 22 18:47:03.836686 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:03.836300 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vbxg" podUID="da9bff9a-df34-4fcf-9338-631fbb086e31" Apr 22 18:47:04.049513 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.049478 2562 generic.go:358] "Generic (PLEG): container finished" podID="7ff492a4-8166-428f-aa35-d7319e606032" containerID="c84a8a8e2c7974388dfdfe9575eb0c96b8527d3d604377ba2c05112fa5fce823" exitCode=0 Apr 22 18:47:04.049667 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.049524 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerDied","Data":"c84a8a8e2c7974388dfdfe9575eb0c96b8527d3d604377ba2c05112fa5fce823"} Apr 22 18:47:04.644367 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.644342 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-208.ec2.internal" event="NodeReady" Apr 22 18:47:04.644523 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.644444 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:47:04.686134 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.686109 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kcplt"] Apr 22 18:47:04.692849 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.692829 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j7m2r"] Apr 22 18:47:04.693004 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.692985 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kcplt" Apr 22 18:47:04.695404 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.695385 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:47:04.695524 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.695505 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:47:04.695524 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.695520 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gf99c\"" Apr 22 18:47:04.695626 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.695579 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:47:04.695704 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.695680 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:47:04.700373 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.700355 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kcplt"] Apr 22 18:47:04.700517 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.700499 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j7m2r"] Apr 22 18:47:04.700517 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.700466 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j7m2r" Apr 22 18:47:04.703425 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.703379 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnqbz\"" Apr 22 18:47:04.703425 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.703380 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:47:04.703747 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.703726 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:47:04.788697 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.788673 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zzrc9"] Apr 22 18:47:04.804277 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.804249 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zzrc9"] Apr 22 18:47:04.804414 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.804375 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zzrc9" Apr 22 18:47:04.806878 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.806858 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:47:04.806994 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.806876 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5j5kr\"" Apr 22 18:47:04.806994 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.806863 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:47:04.806994 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.806912 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:47:04.822156 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822135 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/71884152-334a-4b9f-8f06-c42c443f8518-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt" Apr 22 18:47:04.822250 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822164 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2fb4248-956b-4863-9bda-6b409ba13de6-tmp-dir\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r" Apr 22 18:47:04.822250 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822202 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgmvs\" 
(UniqueName: \"kubernetes.io/projected/71884152-334a-4b9f-8f06-c42c443f8518-kube-api-access-jgmvs\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.822350 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822257 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmcv\" (UniqueName: \"kubernetes.io/projected/d2fb4248-956b-4863-9bda-6b409ba13de6-kube-api-access-jwmcv\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.822350 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822296 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2fb4248-956b-4863-9bda-6b409ba13de6-metrics-tls\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.822350 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822333 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/71884152-334a-4b9f-8f06-c42c443f8518-data-volume\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.822467 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822382 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2fb4248-956b-4863-9bda-6b409ba13de6-config-volume\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.822467 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822446 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/71884152-334a-4b9f-8f06-c42c443f8518-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.822575 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.822473 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/71884152-334a-4b9f-8f06-c42c443f8518-crio-socket\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.923149 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923059 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2fb4248-956b-4863-9bda-6b409ba13de6-config-volume\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.923149 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923103 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/71884152-334a-4b9f-8f06-c42c443f8518-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.923149 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923131 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8457890c-9b4b-4d2b-b315-2a7b5aaa060e-cert\") pod \"ingress-canary-zzrc9\" (UID: \"8457890c-9b4b-4d2b-b315-2a7b5aaa060e\") " pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:04.923641 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923184 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/71884152-334a-4b9f-8f06-c42c443f8518-crio-socket\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.923641 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923250 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv4px\" (UniqueName: \"kubernetes.io/projected/8457890c-9b4b-4d2b-b315-2a7b5aaa060e-kube-api-access-lv4px\") pod \"ingress-canary-zzrc9\" (UID: \"8457890c-9b4b-4d2b-b315-2a7b5aaa060e\") " pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:04.923641 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923520 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/71884152-334a-4b9f-8f06-c42c443f8518-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.923641 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923306 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/71884152-334a-4b9f-8f06-c42c443f8518-crio-socket\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.923641 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923549 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2fb4248-956b-4863-9bda-6b409ba13de6-tmp-dir\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.923881 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923641 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2fb4248-956b-4863-9bda-6b409ba13de6-config-volume\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.923881 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923685 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgmvs\" (UniqueName: \"kubernetes.io/projected/71884152-334a-4b9f-8f06-c42c443f8518-kube-api-access-jgmvs\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.923881 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923722 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmcv\" (UniqueName: \"kubernetes.io/projected/d2fb4248-956b-4863-9bda-6b409ba13de6-kube-api-access-jwmcv\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.923881 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923760 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2fb4248-956b-4863-9bda-6b409ba13de6-metrics-tls\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.923881 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923803 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2fb4248-956b-4863-9bda-6b409ba13de6-tmp-dir\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.923881 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.923821 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/71884152-334a-4b9f-8f06-c42c443f8518-data-volume\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.924154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.924132 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/71884152-334a-4b9f-8f06-c42c443f8518-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.927101 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.927075 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/71884152-334a-4b9f-8f06-c42c443f8518-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.927204 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.927077 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2fb4248-956b-4863-9bda-6b409ba13de6-metrics-tls\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.931930 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.931902 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgmvs\" (UniqueName: \"kubernetes.io/projected/71884152-334a-4b9f-8f06-c42c443f8518-kube-api-access-jgmvs\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:04.932291 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.932271 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmcv\" (UniqueName: \"kubernetes.io/projected/d2fb4248-956b-4863-9bda-6b409ba13de6-kube-api-access-jwmcv\") pod \"dns-default-j7m2r\" (UID: \"d2fb4248-956b-4863-9bda-6b409ba13de6\") " pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:04.932545 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:04.932523 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/71884152-334a-4b9f-8f06-c42c443f8518-data-volume\") pod \"insights-runtime-extractor-kcplt\" (UID: \"71884152-334a-4b9f-8f06-c42c443f8518\") " pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:05.005129 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.005101 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kcplt"
Apr 22 18:47:05.010830 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.010804 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:05.025208 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.024575 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8457890c-9b4b-4d2b-b315-2a7b5aaa060e-cert\") pod \"ingress-canary-zzrc9\" (UID: \"8457890c-9b4b-4d2b-b315-2a7b5aaa060e\") " pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:05.025335 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.025262 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv4px\" (UniqueName: \"kubernetes.io/projected/8457890c-9b4b-4d2b-b315-2a7b5aaa060e-kube-api-access-lv4px\") pod \"ingress-canary-zzrc9\" (UID: \"8457890c-9b4b-4d2b-b315-2a7b5aaa060e\") " pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:05.027397 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.027378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8457890c-9b4b-4d2b-b315-2a7b5aaa060e-cert\") pod \"ingress-canary-zzrc9\" (UID: \"8457890c-9b4b-4d2b-b315-2a7b5aaa060e\") " pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:05.043319 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.043288 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv4px\" (UniqueName: \"kubernetes.io/projected/8457890c-9b4b-4d2b-b315-2a7b5aaa060e-kube-api-access-lv4px\") pod \"ingress-canary-zzrc9\" (UID: \"8457890c-9b4b-4d2b-b315-2a7b5aaa060e\") " pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:05.054538 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.054488 2562 generic.go:358] "Generic (PLEG): container finished" podID="7ff492a4-8166-428f-aa35-d7319e606032" containerID="c62c0439d8d48409279e07deef79cde771c8358eb3ba28baa56383003fc4cca2" exitCode=0
Apr 22 18:47:05.054538 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.054527 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerDied","Data":"c62c0439d8d48409279e07deef79cde771c8358eb3ba28baa56383003fc4cca2"}
Apr 22 18:47:05.120765 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.112226 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zzrc9"
Apr 22 18:47:05.177324 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.177222 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j7m2r"]
Apr 22 18:47:05.179247 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.179196 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kcplt"]
Apr 22 18:47:05.184595 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:05.184543 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71884152_334a_4b9f_8f06_c42c443f8518.slice/crio-fee136f19ebaec605ca32fd007beb3bcc9812401b74e9659e1ad157d7b2d77c3 WatchSource:0}: Error finding container fee136f19ebaec605ca32fd007beb3bcc9812401b74e9659e1ad157d7b2d77c3: Status 404 returned error can't find the container with id fee136f19ebaec605ca32fd007beb3bcc9812401b74e9659e1ad157d7b2d77c3
Apr 22 18:47:05.262205 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.262065 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zzrc9"]
Apr 22 18:47:05.265058 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:05.265005 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8457890c_9b4b_4d2b_b315_2a7b5aaa060e.slice/crio-51aea1c1389bd5ad69ec10139bf3dd293cafd6c5a80551b61ec8df8be7df2556 WatchSource:0}: Error finding container 51aea1c1389bd5ad69ec10139bf3dd293cafd6c5a80551b61ec8df8be7df2556: Status 404 returned error can't find the container with id 51aea1c1389bd5ad69ec10139bf3dd293cafd6c5a80551b61ec8df8be7df2556
Apr 22 18:47:05.837078 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.836686 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:47:05.837078 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.836990 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:47:05.840099 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.840068 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:05.840300 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.840281 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:05.840828 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.840804 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:05.841127 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.841090 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-scwsl\""
Apr 22 18:47:05.841234 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:05.841128 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4wrrh\""
Apr 22 18:47:06.063683 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.063648 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" event={"ID":"7ff492a4-8166-428f-aa35-d7319e606032","Type":"ContainerStarted","Data":"75b97859f7590643c7d7d22ae5731829bbfd28b95f01fcb7f62f156ec95dc61a"}
Apr 22 18:47:06.065303 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.065231 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcplt" event={"ID":"71884152-334a-4b9f-8f06-c42c443f8518","Type":"ContainerStarted","Data":"eefd2da01351d34a65f8ce9ef5628bbaf13bd49fd9334cd01646b3421bf00176"}
Apr 22 18:47:06.065303 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.065281 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcplt" event={"ID":"71884152-334a-4b9f-8f06-c42c443f8518","Type":"ContainerStarted","Data":"fee136f19ebaec605ca32fd007beb3bcc9812401b74e9659e1ad157d7b2d77c3"}
Apr 22 18:47:06.066314 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.066292 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zzrc9" event={"ID":"8457890c-9b4b-4d2b-b315-2a7b5aaa060e","Type":"ContainerStarted","Data":"51aea1c1389bd5ad69ec10139bf3dd293cafd6c5a80551b61ec8df8be7df2556"}
Apr 22 18:47:06.067369 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.067345 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7m2r" event={"ID":"d2fb4248-956b-4863-9bda-6b409ba13de6","Type":"ContainerStarted","Data":"71bb572ef951395c566b7d4a9eadcdcb6650acdc2f8f8ff600e26a36de01095e"}
Apr 22 18:47:06.951882 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.951811 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rwjzd" podStartSLOduration=6.173614432 podStartE2EDuration="37.951789748s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.324382287 +0000 UTC m=+3.003608563" lastFinishedPulling="2026-04-22 18:47:03.102557596 +0000 UTC m=+34.781783879" observedRunningTime="2026-04-22 18:47:06.088381992 +0000 UTC m=+37.767608285" watchObservedRunningTime="2026-04-22 18:47:06.951789748 +0000 UTC m=+38.631016036"
Apr 22 18:47:06.952390 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.952372 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"]
Apr 22 18:47:06.991289 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.991236 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"]
Apr 22 18:47:06.991449 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.991294 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:06.994388 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.994364 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 18:47:06.994490 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.994458 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:47:06.994725 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.994702 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:47:06.994830 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.994731 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:47:06.994830 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.994799 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:47:06.995317 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:06.995296 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2j68g\""
Apr 22 18:47:07.009432 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.009364 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jhkhr"]
Apr 22 18:47:07.009549 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.009500 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.012588 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.012569 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-zx5db\""
Apr 22 18:47:07.012695 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.012628 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 18:47:07.012763 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.012704 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 18:47:07.012813 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.012802 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 18:47:07.028222 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.028197 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"]
Apr 22 18:47:07.028310 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.028226 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"]
Apr 22 18:47:07.028378 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.028359 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.030767 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.030748 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7zc6s\""
Apr 22 18:47:07.030972 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.030956 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:47:07.031134 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.031099 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:47:07.031249 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.031230 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:47:07.141261 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141222 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78d16b88-530b-4cf5-a6d2-d70097c71800-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141279 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-root\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141336 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78d16b88-530b-4cf5-a6d2-d70097c71800-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141394 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-textfile\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141467 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78d16b88-530b-4cf5-a6d2-d70097c71800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141507 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141540 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141568 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-accelerators-collector-config\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141632 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74910032-40a3-4178-a85b-c09cd90b2d70-metrics-client-ca\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141653 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2841b4ea-b22d-494c-89ef-c9e8cea71efd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141686 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141728 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-sys\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141760 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-wtmp\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141790 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141813 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841b4ea-b22d-494c-89ef-c9e8cea71efd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.141834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141843 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-tls\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.142493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141888 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhvn4\" (UniqueName: \"kubernetes.io/projected/74910032-40a3-4178-a85b-c09cd90b2d70-kube-api-access-rhvn4\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.142493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141923 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l5r2\" (UniqueName: \"kubernetes.io/projected/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-api-access-4l5r2\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.142493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.141941 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntf8\" (UniqueName: \"kubernetes.io/projected/78d16b88-530b-4cf5-a6d2-d70097c71800-kube-api-access-gntf8\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.242840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78d16b88-530b-4cf5-a6d2-d70097c71800-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.242840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242814 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-root\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242844 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78d16b88-530b-4cf5-a6d2-d70097c71800-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242863 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-textfile\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242896 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78d16b88-530b-4cf5-a6d2-d70097c71800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242929 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242939 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-root\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.242960 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243009 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-accelerators-collector-config\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243094 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74910032-40a3-4178-a85b-c09cd90b2d70-metrics-client-ca\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243125 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2841b4ea-b22d-494c-89ef-c9e8cea71efd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.243469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243342 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-textfile\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr"
Apr 22 18:47:07.243469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243387 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"
Apr 22 18:47:07.243469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243434 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-sys\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " 
pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.243469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243459 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-wtmp\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243491 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243521 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841b4ea-b22d-494c-89ef-c9e8cea71efd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243550 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-tls\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243609 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhvn4\" (UniqueName: 
\"kubernetes.io/projected/74910032-40a3-4178-a85b-c09cd90b2d70-kube-api-access-rhvn4\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243639 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l5r2\" (UniqueName: \"kubernetes.io/projected/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-api-access-4l5r2\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243667 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gntf8\" (UniqueName: \"kubernetes.io/projected/78d16b88-530b-4cf5-a6d2-d70097c71800-kube-api-access-gntf8\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" Apr 22 18:47:07.243708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243665 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2841b4ea-b22d-494c-89ef-c9e8cea71efd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.244060 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-accelerators-collector-config\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " 
pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.244060 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243750 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/74910032-40a3-4178-a85b-c09cd90b2d70-metrics-client-ca\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.244060 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243768 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78d16b88-530b-4cf5-a6d2-d70097c71800-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" Apr 22 18:47:07.244060 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:07.243872 2562 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:47:07.244060 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.243922 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-sys\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.244060 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:07.243958 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-tls podName:74910032-40a3-4178-a85b-c09cd90b2d70 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:07.743937357 +0000 UTC m=+39.423163642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-tls") pod "node-exporter-jhkhr" (UID: "74910032-40a3-4178-a85b-c09cd90b2d70") : secret "node-exporter-tls" not found Apr 22 18:47:07.244361 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.244066 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-wtmp\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.244361 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.244265 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841b4ea-b22d-494c-89ef-c9e8cea71efd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.244562 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.244522 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.247175 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.247150 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.247267 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.247204 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.247332 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.247267 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78d16b88-530b-4cf5-a6d2-d70097c71800-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" Apr 22 18:47:07.247898 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.247870 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.248537 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.248516 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78d16b88-530b-4cf5-a6d2-d70097c71800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" Apr 22 18:47:07.255566 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:47:07.255538 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntf8\" (UniqueName: \"kubernetes.io/projected/78d16b88-530b-4cf5-a6d2-d70097c71800-kube-api-access-gntf8\") pod \"openshift-state-metrics-9d44df66c-lz682\" (UID: \"78d16b88-530b-4cf5-a6d2-d70097c71800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" Apr 22 18:47:07.255834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.255755 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l5r2\" (UniqueName: \"kubernetes.io/projected/2841b4ea-b22d-494c-89ef-c9e8cea71efd-kube-api-access-4l5r2\") pod \"kube-state-metrics-69db897b98-mzh9k\" (UID: \"2841b4ea-b22d-494c-89ef-c9e8cea71efd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.255998 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.255977 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhvn4\" (UniqueName: \"kubernetes.io/projected/74910032-40a3-4178-a85b-c09cd90b2d70-kube-api-access-rhvn4\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.301774 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.301742 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" Apr 22 18:47:07.320990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.320941 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" Apr 22 18:47:07.747656 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.747626 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-tls\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.755578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.755512 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/74910032-40a3-4178-a85b-c09cd90b2d70-node-exporter-tls\") pod \"node-exporter-jhkhr\" (UID: \"74910032-40a3-4178-a85b-c09cd90b2d70\") " pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.901092 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.901049 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mzh9k"] Apr 22 18:47:07.905231 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:07.905197 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2841b4ea_b22d_494c_89ef_c9e8cea71efd.slice/crio-60469160b9458c1ad26a95b17d213ddf9dbac2baf7fa8bc831d79b7016e33463 WatchSource:0}: Error finding container 60469160b9458c1ad26a95b17d213ddf9dbac2baf7fa8bc831d79b7016e33463: Status 404 returned error can't find the container with id 60469160b9458c1ad26a95b17d213ddf9dbac2baf7fa8bc831d79b7016e33463 Apr 22 18:47:07.927800 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.927774 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-lz682"] Apr 22 18:47:07.932139 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:07.932113 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d16b88_530b_4cf5_a6d2_d70097c71800.slice/crio-597a3b5e42ad81a8a7accb9d7d95bff8a1be8801e782286702f85c3f05b52ad0 WatchSource:0}: Error finding container 597a3b5e42ad81a8a7accb9d7d95bff8a1be8801e782286702f85c3f05b52ad0: Status 404 returned error can't find the container with id 597a3b5e42ad81a8a7accb9d7d95bff8a1be8801e782286702f85c3f05b52ad0 Apr 22 18:47:07.938337 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:07.938309 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jhkhr" Apr 22 18:47:07.953371 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:07.953228 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74910032_40a3_4178_a85b_c09cd90b2d70.slice/crio-b81ddff8ee28c7fc391a4274e04258cfdfa460720e46061e3d94f87872fa9e1a WatchSource:0}: Error finding container b81ddff8ee28c7fc391a4274e04258cfdfa460720e46061e3d94f87872fa9e1a: Status 404 returned error can't find the container with id b81ddff8ee28c7fc391a4274e04258cfdfa460720e46061e3d94f87872fa9e1a Apr 22 18:47:08.067392 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.067360 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:08.079868 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079837 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7m2r" event={"ID":"d2fb4248-956b-4863-9bda-6b409ba13de6","Type":"ContainerStarted","Data":"b25cd9e61c9365effdf299f38526142434f98c344d2ed6fda70529a67fa63cda"} Apr 22 18:47:08.079868 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079867 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcplt" 
event={"ID":"71884152-334a-4b9f-8f06-c42c443f8518","Type":"ContainerStarted","Data":"326baa54ac64527c26d9d1e0e1f9a2f0762177adab830d20d4f0acf6f76f9b18"} Apr 22 18:47:08.080038 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079881 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" event={"ID":"2841b4ea-b22d-494c-89ef-c9e8cea71efd","Type":"ContainerStarted","Data":"60469160b9458c1ad26a95b17d213ddf9dbac2baf7fa8bc831d79b7016e33463"} Apr 22 18:47:08.080038 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079895 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zzrc9" event={"ID":"8457890c-9b4b-4d2b-b315-2a7b5aaa060e","Type":"ContainerStarted","Data":"150d828baa5c72d342f2b0498c15d020435758d0f0b7644cf54db3d2f86ac72d"} Apr 22 18:47:08.080038 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079909 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jhkhr" event={"ID":"74910032-40a3-4178-a85b-c09cd90b2d70","Type":"ContainerStarted","Data":"b81ddff8ee28c7fc391a4274e04258cfdfa460720e46061e3d94f87872fa9e1a"} Apr 22 18:47:08.080038 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079923 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" event={"ID":"78d16b88-530b-4cf5-a6d2-d70097c71800","Type":"ContainerStarted","Data":"361981ae1127db835ad964d75858664881fe533e06514f7c963286e560886efd"} Apr 22 18:47:08.080038 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.079938 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" event={"ID":"78d16b88-530b-4cf5-a6d2-d70097c71800","Type":"ContainerStarted","Data":"597a3b5e42ad81a8a7accb9d7d95bff8a1be8801e782286702f85c3f05b52ad0"} Apr 22 18:47:08.080291 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.080043 2562 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.082819 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.082792 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-czfzj\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.083377 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.083404 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.083438 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.083626 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.083837 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.084010 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:47:08.084061 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.084048 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:47:08.084433 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.084169 2562 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:47:08.084714 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.084550 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:47:08.085184 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.085143 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:08.097286 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.097243 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zzrc9" podStartSLOduration=1.614627396 podStartE2EDuration="4.097231544s" podCreationTimestamp="2026-04-22 18:47:04 +0000 UTC" firstStartedPulling="2026-04-22 18:47:05.266961012 +0000 UTC m=+36.946187282" lastFinishedPulling="2026-04-22 18:47:07.749565122 +0000 UTC m=+39.428791430" observedRunningTime="2026-04-22 18:47:08.096948379 +0000 UTC m=+39.776174670" watchObservedRunningTime="2026-04-22 18:47:08.097231544 +0000 UTC m=+39.776457833" Apr 22 18:47:08.251653 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.251623 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.251653 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.251661 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.251686 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.251778 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.251942 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252000 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252058 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-volume\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252085 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252110 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-web-config\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252138 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252166 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-out\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:47:08.252192 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.252438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.252233 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbf5f\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-kube-api-access-tbf5f\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353606 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353506 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353606 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353568 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353606 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353604 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-volume\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353627 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353649 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-web-config\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353677 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353700 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-out\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:08.353726 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle podName:d7f3508d-ae22-4e4a-bae0-0df307c79003 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:08.853701484 +0000 UTC m=+40.532927768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003") : configmap references non-existent config key: ca-bundle.crt
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353770 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.353854 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353818 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbf5f\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-kube-api-access-tbf5f\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.354264 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353860 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.354264 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353890 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.354264 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353916 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.354264 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.353944 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.354559 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.354535 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.355002 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.354672 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.357104 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.356802 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.358580 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.358551 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-web-config\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.358679 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.358650 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.359212 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.359188 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.359663 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.359618 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.360075 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.360052 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-volume\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.360229 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.360162 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-out\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.360729 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.360686 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.360939 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.360918 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.373886 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.373857 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbf5f\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-kube-api-access-tbf5f\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.858737 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.858688 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:08.859618 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:08.859567 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:09.002451 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.002421 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:47:09.057139 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.057108 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-64f989975c-2xc7s"]
Apr 22 18:47:09.076872 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.076845 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-64f989975c-2xc7s"]
Apr 22 18:47:09.077090 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.077070 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.079657 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.079631 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 18:47:09.079779 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.079690 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 18:47:09.079850 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.079831 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2gav3rfqid1s2\""
Apr 22 18:47:09.079917 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.079902 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-kl5k7\""
Apr 22 18:47:09.080074 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.080061 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 18:47:09.080290 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.080265 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 18:47:09.080397 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.080296 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 18:47:09.083253 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.083228 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" event={"ID":"78d16b88-530b-4cf5-a6d2-d70097c71800","Type":"ContainerStarted","Data":"c26f31e4ab7a13b9e6484d166ee37c4cea3f04cf8b351230d62ee145ff520f76"}
Apr 22 18:47:09.085277 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.085253 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7m2r" event={"ID":"d2fb4248-956b-4863-9bda-6b409ba13de6","Type":"ContainerStarted","Data":"2b4367e653a8e34e72e07f4dc4960acc75875df6aecacde6445141f1443d94ed"}
Apr 22 18:47:09.110760 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.110658 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j7m2r" podStartSLOduration=2.547235352 podStartE2EDuration="5.110643717s" podCreationTimestamp="2026-04-22 18:47:04 +0000 UTC" firstStartedPulling="2026-04-22 18:47:05.18395373 +0000 UTC m=+36.863180001" lastFinishedPulling="2026-04-22 18:47:07.747362086 +0000 UTC m=+39.426588366" observedRunningTime="2026-04-22 18:47:09.109365557 +0000 UTC m=+40.788591848" watchObservedRunningTime="2026-04-22 18:47:09.110643717 +0000 UTC m=+40.789870009"
Apr 22 18:47:09.262396 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-tls\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262413 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262472 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262498 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/814b00bd-8708-4647-b738-0041818be65e-metrics-client-ca\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262564 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262601 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262645 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-grpc-tls\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.262775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.262670 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v5mg\" (UniqueName: \"kubernetes.io/projected/814b00bd-8708-4647-b738-0041818be65e-kube-api-access-4v5mg\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363677 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363594 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363677 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363659 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363677 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363679 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/814b00bd-8708-4647-b738-0041818be65e-metrics-client-ca\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363934 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363716 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363934 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363741 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363934 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363770 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-grpc-tls\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.363934 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363872 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v5mg\" (UniqueName: \"kubernetes.io/projected/814b00bd-8708-4647-b738-0041818be65e-kube-api-access-4v5mg\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.364126 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.363987 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-tls\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.364668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.364635 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/814b00bd-8708-4647-b738-0041818be65e-metrics-client-ca\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.366888 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.366831 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.366888 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.366859 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.367096 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.367075 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-tls\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.367249 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.367231 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-grpc-tls\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.367444 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.367425 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.367507 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.367427 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/814b00bd-8708-4647-b738-0041818be65e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.372100 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.372078 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v5mg\" (UniqueName: \"kubernetes.io/projected/814b00bd-8708-4647-b738-0041818be65e-kube-api-access-4v5mg\") pod \"thanos-querier-64f989975c-2xc7s\" (UID: \"814b00bd-8708-4647-b738-0041818be65e\") " pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:09.388840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:09.388811 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:10.087696 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:10.087674 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:10.707361 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:10.707337 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-64f989975c-2xc7s"]
Apr 22 18:47:10.710331 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:10.710309 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814b00bd_8708_4647_b738_0041818be65e.slice/crio-1a38a43dca1219bed22823737a7c0f685ee4dc02cd2bec23b960ef24179476f8 WatchSource:0}: Error finding container 1a38a43dca1219bed22823737a7c0f685ee4dc02cd2bec23b960ef24179476f8: Status 404 returned error can't find the container with id 1a38a43dca1219bed22823737a7c0f685ee4dc02cd2bec23b960ef24179476f8
Apr 22 18:47:10.725106 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:10.725069 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:47:10.848981 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:10.848950 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7f3508d_ae22_4e4a_bae0_0df307c79003.slice/crio-5650c9910a8d48dd5282ec4bfdb29d0404771f79ab87e9ac27e15a80019f64da WatchSource:0}: Error finding container 5650c9910a8d48dd5282ec4bfdb29d0404771f79ab87e9ac27e15a80019f64da: Status 404 returned error can't find the container with id 5650c9910a8d48dd5282ec4bfdb29d0404771f79ab87e9ac27e15a80019f64da
Apr 22 18:47:11.092485 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.092451 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcplt" event={"ID":"71884152-334a-4b9f-8f06-c42c443f8518","Type":"ContainerStarted","Data":"a11b00ce98f20e956ab65501efbe64a16273eac19db9b2212723fbcef2750a8a"}
Apr 22 18:47:11.094395 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.094365 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" event={"ID":"2841b4ea-b22d-494c-89ef-c9e8cea71efd","Type":"ContainerStarted","Data":"ed6c564abc9a14c6ef3a7eecb296cad25c9d7a06229e833a73477d675c12f967"}
Apr 22 18:47:11.094509 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.094401 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" event={"ID":"2841b4ea-b22d-494c-89ef-c9e8cea71efd","Type":"ContainerStarted","Data":"63bd4c2319a1a718bf622ea5fd0b8b9c6350c7f6f36e12cd924ffd3639b8146f"}
Apr 22 18:47:11.094509 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.094417 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" event={"ID":"2841b4ea-b22d-494c-89ef-c9e8cea71efd","Type":"ContainerStarted","Data":"18217a288dafc6cf6467fde296193f9b83f9b8d5c649c9b642fedb0eebc515da"}
Apr 22 18:47:11.095520 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.095468 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"1a38a43dca1219bed22823737a7c0f685ee4dc02cd2bec23b960ef24179476f8"}
Apr 22 18:47:11.097113 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.097087 2562 generic.go:358] "Generic (PLEG): container finished" podID="74910032-40a3-4178-a85b-c09cd90b2d70" containerID="f771c9b4bb3550084c932d9a82f88a1a595abdfd648edb76e7168b101ad8247f" exitCode=0
Apr 22 18:47:11.097199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.097177 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jhkhr" event={"ID":"74910032-40a3-4178-a85b-c09cd90b2d70","Type":"ContainerDied","Data":"f771c9b4bb3550084c932d9a82f88a1a595abdfd648edb76e7168b101ad8247f"}
Apr 22 18:47:11.101724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.101704 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" event={"ID":"78d16b88-530b-4cf5-a6d2-d70097c71800","Type":"ContainerStarted","Data":"6640a042be49794b7e9795dc92b668057833fe644685ed41fd536e348eca0c8b"}
Apr 22 18:47:11.102969 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.102947 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"5650c9910a8d48dd5282ec4bfdb29d0404771f79ab87e9ac27e15a80019f64da"}
Apr 22 18:47:11.109084 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.109045 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kcplt" podStartSLOduration=1.8493633040000002 podStartE2EDuration="7.109010035s" podCreationTimestamp="2026-04-22 18:47:04 +0000 UTC" firstStartedPulling="2026-04-22 18:47:05.311499378 +0000 UTC m=+36.990725648" lastFinishedPulling="2026-04-22 18:47:10.571146108 +0000 UTC m=+42.250372379" observedRunningTime="2026-04-22 18:47:11.108237038 +0000 UTC m=+42.787463334" watchObservedRunningTime="2026-04-22 18:47:11.109010035 +0000 UTC m=+42.788236327"
Apr 22 18:47:11.126430 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.126376 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mzh9k" podStartSLOduration=2.4620169880000002 podStartE2EDuration="5.126361038s" podCreationTimestamp="2026-04-22 18:47:06 +0000 UTC" firstStartedPulling="2026-04-22 18:47:07.907201808 +0000 UTC m=+39.586428077" lastFinishedPulling="2026-04-22 18:47:10.571545853 +0000 UTC m=+42.250772127" observedRunningTime="2026-04-22 18:47:11.12479475 +0000 UTC m=+42.804021043" watchObservedRunningTime="2026-04-22 18:47:11.126361038 +0000 UTC m=+42.805587331"
Apr 22 18:47:11.160087 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.160011 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-lz682" podStartSLOduration=2.441841876 podStartE2EDuration="5.159993009s" podCreationTimestamp="2026-04-22 18:47:06 +0000 UTC" firstStartedPulling="2026-04-22 18:47:08.14215884 +0000 UTC m=+39.821385110" lastFinishedPulling="2026-04-22 18:47:10.860309968 +0000 UTC m=+42.539536243" observedRunningTime="2026-04-22 18:47:11.159000693 +0000 UTC m=+42.838226985" watchObservedRunningTime="2026-04-22 18:47:11.159993009 +0000 UTC m=+42.839219302"
Apr 22 18:47:11.355886 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.355835 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-864cb4674-xgnfz"]
Apr 22 18:47:11.369494 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.369468 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-864cb4674-xgnfz"]
Apr 22 18:47:11.369636 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.369612 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.372361 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.372340 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 18:47:11.372361 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.372355 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 18:47:11.372496 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.372478 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9qbi95sppbcjb\""
Apr 22 18:47:11.372612 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.372598 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 18:47:11.372697 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.372635 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 18:47:11.372913 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.372888 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-4nfqc\""
Apr 22 18:47:11.379092 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379069 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-secret-metrics-server-client-certs\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.379174 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379102 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6d7r\" (UniqueName: \"kubernetes.io/projected/7cfd4bf2-8f28-408b-a116-ad7939016998-kube-api-access-x6d7r\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.379174 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379161 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7cfd4bf2-8f28-408b-a116-ad7939016998-metrics-server-audit-profiles\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.379308 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379198 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7cfd4bf2-8f28-408b-a116-ad7939016998-audit-log\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.379308 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379261 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-client-ca-bundle\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.379308 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379304 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cfd4bf2-8f28-408b-a116-ad7939016998-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.379405 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.379334 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-secret-metrics-server-tls\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.480481 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480449 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7cfd4bf2-8f28-408b-a116-ad7939016998-metrics-server-audit-profiles\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.480626 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7cfd4bf2-8f28-408b-a116-ad7939016998-audit-log\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.480626 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480546 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-client-ca-bundle\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:11.480626 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cfd4bf2-8f28-408b-a116-ad7939016998-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.480786 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480642 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-secret-metrics-server-tls\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.480786 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480670 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-secret-metrics-server-client-certs\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.480786 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.480695 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6d7r\" (UniqueName: \"kubernetes.io/projected/7cfd4bf2-8f28-408b-a116-ad7939016998-kube-api-access-x6d7r\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.481212 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.481185 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/7cfd4bf2-8f28-408b-a116-ad7939016998-audit-log\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.481601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.481571 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cfd4bf2-8f28-408b-a116-ad7939016998-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.483584 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.483559 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-client-ca-bundle\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.483827 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.483800 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-secret-metrics-server-tls\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.483911 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.483853 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/7cfd4bf2-8f28-408b-a116-ad7939016998-secret-metrics-server-client-certs\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 
18:47:11.488683 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.488662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6d7r\" (UniqueName: \"kubernetes.io/projected/7cfd4bf2-8f28-408b-a116-ad7939016998-kube-api-access-x6d7r\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.489526 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.489507 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7cfd4bf2-8f28-408b-a116-ad7939016998-metrics-server-audit-profiles\") pod \"metrics-server-864cb4674-xgnfz\" (UID: \"7cfd4bf2-8f28-408b-a116-ad7939016998\") " pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.680588 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.680489 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" Apr 22 18:47:11.712738 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.712703 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"] Apr 22 18:47:11.755430 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.755398 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"] Apr 22 18:47:11.755648 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.755629 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" Apr 22 18:47:11.759954 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.759098 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:47:11.759954 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.759330 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-cqt4t\"" Apr 22 18:47:11.783485 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.783426 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4wbp6\" (UID: \"c42fbe22-3543-489f-99d7-1f5578ffae18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" Apr 22 18:47:11.832036 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.831845 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-864cb4674-xgnfz"] Apr 22 18:47:11.834893 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:11.834869 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfd4bf2_8f28_408b_a116_ad7939016998.slice/crio-79c1da4e72344a631805a60a0a550f94a375a9dc15859c613a5947f873fc4bf8 WatchSource:0}: Error finding container 79c1da4e72344a631805a60a0a550f94a375a9dc15859c613a5947f873fc4bf8: Status 404 returned error can't find the container with id 79c1da4e72344a631805a60a0a550f94a375a9dc15859c613a5947f873fc4bf8 Apr 22 18:47:11.884760 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:11.884730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4wbp6\" (UID: \"c42fbe22-3543-489f-99d7-1f5578ffae18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" Apr 22 18:47:11.884900 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:11.884881 2562 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 18:47:11.884970 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:11.884959 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert podName:c42fbe22-3543-489f-99d7-1f5578ffae18 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:12.384943794 +0000 UTC m=+44.064170064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-4wbp6" (UID: "c42fbe22-3543-489f-99d7-1f5578ffae18") : secret "monitoring-plugin-cert" not found Apr 22 18:47:12.109195 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:12.109118 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jhkhr" event={"ID":"74910032-40a3-4178-a85b-c09cd90b2d70","Type":"ContainerStarted","Data":"92902bc47919f6950e4b39262d3d175fc4b08cab72d88a4484b46091f821085f"} Apr 22 18:47:12.109195 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:12.109162 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jhkhr" event={"ID":"74910032-40a3-4178-a85b-c09cd90b2d70","Type":"ContainerStarted","Data":"db50ae264f4f71e19aaceaa4c2760d24169324699fed864ecc0ce8571dceb1f0"} Apr 22 18:47:12.110592 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:12.110559 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" event={"ID":"7cfd4bf2-8f28-408b-a116-ad7939016998","Type":"ContainerStarted","Data":"79c1da4e72344a631805a60a0a550f94a375a9dc15859c613a5947f873fc4bf8"} Apr 22 18:47:12.129272 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:12.129234 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jhkhr" podStartSLOduration=3.512952509 podStartE2EDuration="6.129219422s" podCreationTimestamp="2026-04-22 18:47:06 +0000 UTC" firstStartedPulling="2026-04-22 18:47:07.954880597 +0000 UTC m=+39.634106867" lastFinishedPulling="2026-04-22 18:47:10.571147506 +0000 UTC m=+42.250373780" observedRunningTime="2026-04-22 18:47:12.127523689 +0000 UTC m=+43.806749980" watchObservedRunningTime="2026-04-22 18:47:12.129219422 +0000 UTC m=+43.808445715" Apr 22 18:47:12.390764 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:12.390680 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4wbp6\" (UID: \"c42fbe22-3543-489f-99d7-1f5578ffae18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" Apr 22 18:47:12.390914 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:12.390824 2562 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 18:47:12.390914 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:47:12.390905 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert podName:c42fbe22-3543-489f-99d7-1f5578ffae18 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:13.390888558 +0000 UTC m=+45.070114829 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-4wbp6" (UID: "c42fbe22-3543-489f-99d7-1f5578ffae18") : secret "monitoring-plugin-cert" not found Apr 22 18:47:13.159431 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.159407 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:13.172568 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.172546 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.175628 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.175492 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:47:13.175726 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.175699 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:47:13.176065 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.175844 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:47:13.176065 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.175914 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:47:13.176304 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.176163 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cqprc2pk0hk0r\"" Apr 22 18:47:13.176413 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.176366 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-t6bwc\"" Apr 22 18:47:13.177204 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.177073 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:47:13.177204 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.177083 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:47:13.177977 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.177955 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:47:13.178342 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.178232 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:47:13.178454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.178379 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:47:13.178454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.178433 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:47:13.179926 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.179787 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:47:13.180418 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.180350 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:13.182517 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.182496 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 
18:47:13.197126 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197107 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197228 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197171 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197228 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197194 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197228 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197213 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197378 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197228 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197378 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197262 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197378 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197284 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbqz\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-kube-api-access-5dbqz\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197400 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config-out\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197439 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197493 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:47:13.197464 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197483 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197644 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197503 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197644 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197644 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197597 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197675 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-web-config\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197712 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197742 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.197816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.197795 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299055 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299000 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299208 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299183 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-web-config\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299268 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299219 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299268 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299249 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299388 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299293 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299388 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:47:13.299325 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299388 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299381 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299552 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299412 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.299552 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.299442 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300042 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300201 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300263 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300302 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbqz\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-kube-api-access-5dbqz\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300332 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config-out\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300357 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300364 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300423 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300451 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300483 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.302990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.300513 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304062 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304036 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304191 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-web-config\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304237 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304479 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304761 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304808 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.304947 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.305385 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.305362 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.305912 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.305809 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.305912 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.305838 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.305912 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.305829 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.306323 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.306303 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.306408 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.306372 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.306631 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.306606 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config-out\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.307068 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.307031 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.314825 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.314808 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbqz\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-kube-api-access-5dbqz\") pod \"prometheus-k8s-0\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.401351 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.401282 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4wbp6\" (UID: \"c42fbe22-3543-489f-99d7-1f5578ffae18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"
Apr 22 18:47:13.410543 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.410512 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c42fbe22-3543-489f-99d7-1f5578ffae18-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4wbp6\" (UID: \"c42fbe22-3543-489f-99d7-1f5578ffae18\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"
Apr 22 18:47:13.485269 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.485234 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:13.569882 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.569849 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"
Apr 22 18:47:13.783706 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.783634 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:47:13.790182 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:13.790067 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod365d91ad_6d91_4995_b4f3_ff77c31fa9ab.slice/crio-380439a9b93ba912d422890ef08c344f70f4e39dc663fdc847a51fbd9e469ac5 WatchSource:0}: Error finding container 380439a9b93ba912d422890ef08c344f70f4e39dc663fdc847a51fbd9e469ac5: Status 404 returned error can't find the container with id 380439a9b93ba912d422890ef08c344f70f4e39dc663fdc847a51fbd9e469ac5
Apr 22 18:47:13.791127 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:13.791104 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"]
Apr 22 18:47:13.795608 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:13.795581 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42fbe22_3543_489f_99d7_1f5578ffae18.slice/crio-a67683698dcd1469c54210a992a3cc5f22d8c434352d3a6f18138176367850a5 WatchSource:0}: Error finding container a67683698dcd1469c54210a992a3cc5f22d8c434352d3a6f18138176367850a5: Status 404 returned error can't find the container with id a67683698dcd1469c54210a992a3cc5f22d8c434352d3a6f18138176367850a5
Apr 22 18:47:14.117834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.117799 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="fc42b7107435ba5fb20489b0c0e4d63a7b2eb41697213c25c305690f01045384" exitCode=0
Apr 22 18:47:14.117931 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.117895 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"fc42b7107435ba5fb20489b0c0e4d63a7b2eb41697213c25c305690f01045384"}
Apr 22 18:47:14.119435 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.119406 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" exitCode=0
Apr 22 18:47:14.119533 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.119498 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"}
Apr 22 18:47:14.119533 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.119528 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"380439a9b93ba912d422890ef08c344f70f4e39dc663fdc847a51fbd9e469ac5"}
Apr 22 18:47:14.122323 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.122295 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"8b2e828cd36880b35b504eed12a89f9ffc0ed4c245183de1dff1d027a88a00c4"}
Apr 22 18:47:14.122445 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.122328 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"6092810b9b3d38dab92e72902c65fe5e26238813b0950ecb87b0067aa754276e"}
Apr 22 18:47:14.122445 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.122368 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"e99184a9be5a51215db0c75fd52a4f0ff901589c3bd7292323765c77d899b6cd"}
Apr 22 18:47:14.123627 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:14.123598 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" event={"ID":"c42fbe22-3543-489f-99d7-1f5578ffae18","Type":"ContainerStarted","Data":"a67683698dcd1469c54210a992a3cc5f22d8c434352d3a6f18138176367850a5"}
Apr 22 18:47:15.129479 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:15.129426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" event={"ID":"7cfd4bf2-8f28-408b-a116-ad7939016998","Type":"ContainerStarted","Data":"f683d821211234224878d61612022429929f4b8d65d1ab0aed70e83d56323c71"}
Apr 22 18:47:15.150189 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:15.150130 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz" podStartSLOduration=1.9158816760000001 podStartE2EDuration="4.150103241s" podCreationTimestamp="2026-04-22 18:47:11 +0000 UTC" firstStartedPulling="2026-04-22 18:47:11.837244092 +0000 UTC m=+43.516470371" lastFinishedPulling="2026-04-22 18:47:14.071465657 +0000 UTC m=+45.750691936" observedRunningTime="2026-04-22 18:47:15.149556008 +0000 UTC m=+46.828782338" watchObservedRunningTime="2026-04-22 18:47:15.150103241 +0000 UTC m=+46.829329536"
Apr 22 18:47:17.106092 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:17.106055 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j7m2r"
Apr 22 18:47:18.141803 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.141692 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"68fca7d0bde21fdc541f42e9f0677eb9b91bacc017e255d44f1624a7ae6ac306"}
Apr 22 18:47:18.141803 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.141737 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"921193cabf9a2aa6ad3c0286ce0f4e8bb867f286b6bec83a7cf9244478e62a78"}
Apr 22 18:47:18.145214 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.145159 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"}
Apr 22 18:47:18.157296 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.157061 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"c263a168942d2c41dbfe335d86dda6c05c2668d067f777551f327cb8b5e41137"}
Apr 22 18:47:18.157296 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.157196 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"32010b12a5139f04a835aea625042bf4ee5ef540f9531e0d792a67af64fe51c1"}
Apr 22 18:47:18.160392 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.160367 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" event={"ID":"c42fbe22-3543-489f-99d7-1f5578ffae18","Type":"ContainerStarted","Data":"03cc9f97aa11536ec2ac76c7914f6f2944c7dc1a06868e29acdb1720d6081e87"}
Apr 22 18:47:18.160866 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.160834 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"
Apr 22 18:47:18.170718 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.170686 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6"
Apr 22 18:47:18.210973 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:18.210294 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4wbp6" podStartSLOduration=3.16858914 podStartE2EDuration="7.210275134s" podCreationTimestamp="2026-04-22 18:47:11 +0000 UTC" firstStartedPulling="2026-04-22 18:47:13.797912232 +0000 UTC m=+45.477138503" lastFinishedPulling="2026-04-22 18:47:17.839598216 +0000 UTC m=+49.518824497" observedRunningTime="2026-04-22 18:47:18.191600634 +0000 UTC m=+49.870826927" watchObservedRunningTime="2026-04-22 18:47:18.210275134 +0000 UTC m=+49.889501427"
Apr 22 18:47:19.168060 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.168026 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" event={"ID":"814b00bd-8708-4647-b738-0041818be65e","Type":"ContainerStarted","Data":"2c6320cb2fc6140d1405336f205b9957faf33ddfdd33fc447314cfb3ed5c4ac3"}
Apr 22 18:47:19.168484 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.168296 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:19.171146 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.171122 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"a1db6a7bbe206af89ade795e7da87a3f3e095b9a9d011d6696d009310742b2f9"}
Apr 22 18:47:19.171275 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.171152 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"c5dcc0c64e1f3108a84a07d7eb6ba8a1b978deebdcadfcbee4186c19d036e09a"}
Apr 22 18:47:19.171275 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.171164 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"bc619923e2f4739cf175f8819c1edb13ceb5e53bb8b4d740c3551fc091de763b"}
Apr 22 18:47:19.171275 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.171173 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerStarted","Data":"f9b3ac77fd024b62fad77c87b83cd6c76bfe20b39f2d65f8d73e884af4d5eb85"}
Apr 22 18:47:19.174419 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.174392 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"}
Apr 22 18:47:19.174498 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.174426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"}
Apr 22 18:47:19.174498 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.174439 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"}
Apr 22 18:47:19.174498 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.174451 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"}
Apr 22 18:47:19.174597 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.174564 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s"
Apr 22 18:47:19.174597 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.174583 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerStarted","Data":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"}
Apr 22 18:47:19.189066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.189000 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-64f989975c-2xc7s" podStartSLOduration=3.063790556 podStartE2EDuration="10.18898862s" podCreationTimestamp="2026-04-22 18:47:09 +0000 UTC" firstStartedPulling="2026-04-22 18:47:10.712601511 +0000 UTC m=+42.391827794" lastFinishedPulling="2026-04-22 18:47:17.837799587 +0000 UTC m=+49.517025858" observedRunningTime="2026-04-22 18:47:19.188621415 +0000 UTC m=+50.867847706" watchObservedRunningTime="2026-04-22 18:47:19.18898862 +0000 UTC m=+50.868214913"
Apr 22 18:47:19.213482 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.213442 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.229891496 podStartE2EDuration="11.213429622s" podCreationTimestamp="2026-04-22 18:47:08 +0000 UTC" firstStartedPulling="2026-04-22 18:47:10.855681922 +0000 UTC m=+42.534908195" lastFinishedPulling="2026-04-22 18:47:17.83922005 +0000 UTC m=+49.518446321" observedRunningTime="2026-04-22 18:47:19.211492476 +0000 UTC m=+50.890718772" watchObservedRunningTime="2026-04-22 18:47:19.213429622 +0000 UTC m=+50.892655915"
Apr 22 18:47:19.259630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:19.259589 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.53398624 podStartE2EDuration="6.259576602s" podCreationTimestamp="2026-04-22 18:47:13 +0000 UTC" firstStartedPulling="2026-04-22 18:47:14.120961494 +0000 UTC m=+45.800187774" lastFinishedPulling="2026-04-22 18:47:17.846551865 +0000 UTC m=+49.525778136" observedRunningTime="2026-04-22 18:47:19.258137211 +0000 UTC m=+50.937363503" watchObservedRunningTime="2026-04-22 18:47:19.259576602 +0000 UTC m=+50.938802893"
Apr 22 18:47:23.486161 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:23.486130 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:47:28.046119 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:28.046083 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zf6t2"
Apr 22 18:47:31.681379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:31.681351 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:31.681847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:31.681434 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:34.586673 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.586639 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:47:34.589345 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.589326 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:34.599615 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.599598 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:34.610856 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.610832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47h8s\" (UniqueName: \"kubernetes.io/projected/0b614b94-c0eb-4f39-add0-c46922389f94-kube-api-access-47h8s\") pod \"network-check-target-9g8sf\" (UID: \"0b614b94-c0eb-4f39-add0-c46922389f94\") " pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:47:34.659382 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.659359 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4wrrh\""
Apr 22 18:47:34.667075 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.667058 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:47:34.687510 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.687487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:47:34.689939 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.689909 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:34.699894 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.699873 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da9bff9a-df34-4fcf-9338-631fbb086e31-metrics-certs\") pod \"network-metrics-daemon-4vbxg\" (UID: \"da9bff9a-df34-4fcf-9338-631fbb086e31\") " pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:47:34.781568 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.781548 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9g8sf"]
Apr 22 18:47:34.784148 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:34.784126 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b614b94_c0eb_4f39_add0_c46922389f94.slice/crio-1f5015a82ce6e4fab3939397b5b5942835391cf160757cb3a8dfe84aff4ca652 WatchSource:0}: Error finding container 1f5015a82ce6e4fab3939397b5b5942835391cf160757cb3a8dfe84aff4ca652: Status 404 returned error can't find the container with id 1f5015a82ce6e4fab3939397b5b5942835391cf160757cb3a8dfe84aff4ca652
Apr 22 18:47:34.954685 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.954665 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-scwsl\""
Apr 22 18:47:34.962781 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:34.962765 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vbxg"
Apr 22 18:47:35.080833 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:35.080805 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vbxg"]
Apr 22 18:47:35.084005 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:47:35.083980 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda9bff9a_df34_4fcf_9338_631fbb086e31.slice/crio-3d5cab908bb5a264c70b910f4b3616a2c62b774363ae9fe05b6ecef2ff272238 WatchSource:0}: Error finding container 3d5cab908bb5a264c70b910f4b3616a2c62b774363ae9fe05b6ecef2ff272238: Status 404 returned error can't find the container with id 3d5cab908bb5a264c70b910f4b3616a2c62b774363ae9fe05b6ecef2ff272238
Apr 22 18:47:35.219005 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:35.218924 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9g8sf" event={"ID":"0b614b94-c0eb-4f39-add0-c46922389f94","Type":"ContainerStarted","Data":"1f5015a82ce6e4fab3939397b5b5942835391cf160757cb3a8dfe84aff4ca652"}
Apr 22 18:47:35.220991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:35.220940 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vbxg" event={"ID":"da9bff9a-df34-4fcf-9338-631fbb086e31","Type":"ContainerStarted","Data":"3d5cab908bb5a264c70b910f4b3616a2c62b774363ae9fe05b6ecef2ff272238"}
Apr 22 18:47:37.228214 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:37.228169 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vbxg" event={"ID":"da9bff9a-df34-4fcf-9338-631fbb086e31","Type":"ContainerStarted","Data":"909be45abc0341d12fa2af191411398e07b24b7ceaeff96de86e14801fd8b94a"}
Apr 22 18:47:38.232220 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:38.232185 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vbxg" event={"ID":"da9bff9a-df34-4fcf-9338-631fbb086e31","Type":"ContainerStarted","Data":"25d6ccd60555d79ac4ed740e302231a8ada3750390c147e5fd9fa716d24f2c39"}
Apr 22 18:47:38.233460 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:38.233438 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9g8sf" event={"ID":"0b614b94-c0eb-4f39-add0-c46922389f94","Type":"ContainerStarted","Data":"95b08920fe2330da5c7fdf2a4c8000e4a79c28d805ff845eadc1a1cb3d64b101"}
Apr 22 18:47:38.233580 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:38.233543 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:47:38.248735 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:38.248687 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4vbxg" podStartSLOduration=67.938346464 podStartE2EDuration="1m9.248675554s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:47:35.085838205 +0000 UTC m=+66.765064475" lastFinishedPulling="2026-04-22 18:47:36.396167285 +0000 UTC m=+68.075393565" observedRunningTime="2026-04-22 18:47:38.247158919 +0000 UTC m=+69.926385210" watchObservedRunningTime="2026-04-22 18:47:38.248675554 +0000 UTC m=+69.927901846"
Apr 22 18:47:38.262005 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:38.261962 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9g8sf" podStartSLOduration=66.537432252 podStartE2EDuration="1m9.261947467s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:47:34.786039528 +0000 UTC m=+66.465265798" lastFinishedPulling="2026-04-22 18:47:37.510554729 +0000 UTC m=+69.189781013" observedRunningTime="2026-04-22 18:47:38.261343239 +0000 UTC m=+69.940569545" watchObservedRunningTime="2026-04-22 18:47:38.261947467 +0000 UTC m=+69.941173759"
Apr 22 18:47:51.686793 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:51.686693 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:47:51.690418 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:47:51.690391 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-864cb4674-xgnfz"
Apr 22 18:48:09.239042 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:09.238994 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9g8sf"
Apr 22 18:48:17.479922 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:17.479889 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:48:17.498399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:17.498372 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:48:18.376157 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:18.376130 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:48:27.201241 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201205 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:48:27.201731 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201614 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="alertmanager" containerID="cri-o://921193cabf9a2aa6ad3c0286ce0f4e8bb867f286b6bec83a7cf9244478e62a78" gracePeriod=120
Apr 22 18:48:27.201801 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201703 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-web" containerID="cri-o://f9b3ac77fd024b62fad77c87b83cd6c76bfe20b39f2d65f8d73e884af4d5eb85" gracePeriod=120
Apr 22 18:48:27.201801 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201725 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="config-reloader" containerID="cri-o://68fca7d0bde21fdc541f42e9f0677eb9b91bacc017e255d44f1624a7ae6ac306" gracePeriod=120
Apr 22 18:48:27.201801 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201794 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy" containerID="cri-o://bc619923e2f4739cf175f8819c1edb13ceb5e53bb8b4d740c3551fc091de763b" gracePeriod=120
Apr 22 18:48:27.201958 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201697 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-metric" containerID="cri-o://c5dcc0c64e1f3108a84a07d7eb6ba8a1b978deebdcadfcbee4186c19d036e09a" gracePeriod=120
Apr 22 18:48:27.201958 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.201839 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="prom-label-proxy" containerID="cri-o://a1db6a7bbe206af89ade795e7da87a3f3e095b9a9d011d6696d009310742b2f9" gracePeriod=120 Apr 22 18:48:27.387905 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387871 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="a1db6a7bbe206af89ade795e7da87a3f3e095b9a9d011d6696d009310742b2f9" exitCode=0 Apr 22 18:48:27.387905 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387899 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="c5dcc0c64e1f3108a84a07d7eb6ba8a1b978deebdcadfcbee4186c19d036e09a" exitCode=0 Apr 22 18:48:27.387905 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387905 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="bc619923e2f4739cf175f8819c1edb13ceb5e53bb8b4d740c3551fc091de763b" exitCode=0 Apr 22 18:48:27.387905 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387910 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="68fca7d0bde21fdc541f42e9f0677eb9b91bacc017e255d44f1624a7ae6ac306" exitCode=0 Apr 22 18:48:27.388158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387915 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="921193cabf9a2aa6ad3c0286ce0f4e8bb867f286b6bec83a7cf9244478e62a78" exitCode=0 Apr 22 18:48:27.388158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387953 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"a1db6a7bbe206af89ade795e7da87a3f3e095b9a9d011d6696d009310742b2f9"} Apr 22 18:48:27.388158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.387996 2562 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"c5dcc0c64e1f3108a84a07d7eb6ba8a1b978deebdcadfcbee4186c19d036e09a"} Apr 22 18:48:27.388158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.388010 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"bc619923e2f4739cf175f8819c1edb13ceb5e53bb8b4d740c3551fc091de763b"} Apr 22 18:48:27.388158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.388042 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"68fca7d0bde21fdc541f42e9f0677eb9b91bacc017e255d44f1624a7ae6ac306"} Apr 22 18:48:27.388158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:27.388054 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"921193cabf9a2aa6ad3c0286ce0f4e8bb867f286b6bec83a7cf9244478e62a78"} Apr 22 18:48:28.394135 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.394107 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerID="f9b3ac77fd024b62fad77c87b83cd6c76bfe20b39f2d65f8d73e884af4d5eb85" exitCode=0 Apr 22 18:48:28.394463 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.394147 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"f9b3ac77fd024b62fad77c87b83cd6c76bfe20b39f2d65f8d73e884af4d5eb85"} Apr 22 18:48:28.437336 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.437315 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:28.502347 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502319 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-out\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502371 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-metrics-client-ca\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502420 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-web\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502451 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-main-tls\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502565 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-cluster-tls-config\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: 
\"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502622 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-tls-assets\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502671 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-web-config\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502835 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502721 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502835 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502750 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502835 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502782 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-volume\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.502835 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:48:28.502827 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbf5f\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-kube-api-access-tbf5f\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.503066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502866 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.503066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.502913 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-main-db\") pod \"d7f3508d-ae22-4e4a-bae0-0df307c79003\" (UID: \"d7f3508d-ae22-4e4a-bae0-0df307c79003\") " Apr 22 18:48:28.503174 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.503009 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:28.503233 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.503217 2562 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-metrics-client-ca\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.503888 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.503494 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:48:28.503888 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.503845 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:28.505424 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.505372 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.505521 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.505437 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.505894 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.505860 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-out" (OuterVolumeSpecName: "config-out") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:48:28.506582 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.506559 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:48:28.506661 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.506600 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-volume" (OuterVolumeSpecName: "config-volume") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.507117 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.507086 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.507291 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.507256 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-kube-api-access-tbf5f" (OuterVolumeSpecName: "kube-api-access-tbf5f") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "kube-api-access-tbf5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:48:28.507394 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.507295 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.510456 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.510429 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.517273 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.517209 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-web-config" (OuterVolumeSpecName: "web-config") pod "d7f3508d-ae22-4e4a-bae0-0df307c79003" (UID: "d7f3508d-ae22-4e4a-bae0-0df307c79003"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:28.603545 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603519 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-main-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603545 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603541 2562 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-cluster-tls-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603551 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-tls-assets\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603560 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-web-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603569 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603579 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603587 2562 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-volume\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603596 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbf5f\" (UniqueName: \"kubernetes.io/projected/d7f3508d-ae22-4e4a-bae0-0df307c79003-kube-api-access-tbf5f\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603606 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603616 2562 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-alertmanager-main-db\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603624 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/d7f3508d-ae22-4e4a-bae0-0df307c79003-config-out\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:28.603668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:28.603633 2562 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d7f3508d-ae22-4e4a-bae0-0df307c79003-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:48:29.399380 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.399350 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d7f3508d-ae22-4e4a-bae0-0df307c79003","Type":"ContainerDied","Data":"5650c9910a8d48dd5282ec4bfdb29d0404771f79ab87e9ac27e15a80019f64da"} Apr 22 18:48:29.399731 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.399396 2562 scope.go:117] "RemoveContainer" containerID="a1db6a7bbe206af89ade795e7da87a3f3e095b9a9d011d6696d009310742b2f9" Apr 22 18:48:29.399731 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.399466 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.407688 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.407670 2562 scope.go:117] "RemoveContainer" containerID="c5dcc0c64e1f3108a84a07d7eb6ba8a1b978deebdcadfcbee4186c19d036e09a" Apr 22 18:48:29.414220 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.414203 2562 scope.go:117] "RemoveContainer" containerID="bc619923e2f4739cf175f8819c1edb13ceb5e53bb8b4d740c3551fc091de763b" Apr 22 18:48:29.420602 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.420576 2562 scope.go:117] "RemoveContainer" containerID="f9b3ac77fd024b62fad77c87b83cd6c76bfe20b39f2d65f8d73e884af4d5eb85" Apr 22 18:48:29.422695 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.422674 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:48:29.427432 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.427410 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:48:29.428603 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.428587 2562 scope.go:117] "RemoveContainer" containerID="68fca7d0bde21fdc541f42e9f0677eb9b91bacc017e255d44f1624a7ae6ac306" Apr 22 18:48:29.434469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.434451 2562 scope.go:117] "RemoveContainer" containerID="921193cabf9a2aa6ad3c0286ce0f4e8bb867f286b6bec83a7cf9244478e62a78" Apr 22 18:48:29.440464 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.440450 2562 scope.go:117] "RemoveContainer" containerID="fc42b7107435ba5fb20489b0c0e4d63a7b2eb41697213c25c305690f01045384" Apr 22 18:48:29.452674 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452656 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:48:29.452949 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452937 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy" Apr 22 18:48:29.452991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452951 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy" Apr 22 18:48:29.452991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452963 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="prom-label-proxy" Apr 22 18:48:29.452991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452969 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="prom-label-proxy" Apr 22 18:48:29.452991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452979 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-web" Apr 22 18:48:29.452991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452984 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-web" Apr 22 18:48:29.452991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452992 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="init-config-reloader" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.452997 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="init-config-reloader" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453004 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-metric" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453009 2562 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-metric" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453037 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="alertmanager" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453043 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="alertmanager" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453050 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="config-reloader" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453054 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="config-reloader" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453115 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="prom-label-proxy" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453126 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-metric" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453133 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="alertmanager" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453140 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="config-reloader" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453146 
2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy-web" Apr 22 18:48:29.453199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.453152 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" containerName="kube-rbac-proxy" Apr 22 18:48:29.457961 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.457944 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.460622 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460603 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:48:29.460715 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460701 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:48:29.460775 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460716 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:48:29.460833 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460817 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:48:29.460924 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460910 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-czfzj\"" Apr 22 18:48:29.460990 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460975 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:48:29.461067 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.460994 2562 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:48:29.461268 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.461247 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:48:29.461268 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.461262 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:48:29.465914 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.465897 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:48:29.469110 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.469090 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:48:29.510021 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.509996 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510113 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510044 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510113 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510063 
2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510120 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7913899-f23d-44b1-b927-ea1c19e65898-config-out\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510157 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-web-config\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510183 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510181 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7913899-f23d-44b1-b927-ea1c19e65898-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510276 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510214 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7913899-f23d-44b1-b927-ea1c19e65898-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
18:48:29.510276 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510239 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjg5\" (UniqueName: \"kubernetes.io/projected/f7913899-f23d-44b1-b927-ea1c19e65898-kube-api-access-fhjg5\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510276 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510258 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510376 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510277 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510376 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510298 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7913899-f23d-44b1-b927-ea1c19e65898-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510376 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510313 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/f7913899-f23d-44b1-b927-ea1c19e65898-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.510376 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.510328 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610674 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610620 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7913899-f23d-44b1-b927-ea1c19e65898-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610674 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610648 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7913899-f23d-44b1-b927-ea1c19e65898-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610674 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610666 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610695 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610713 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610733 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.610847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.610759 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7913899-f23d-44b1-b927-ea1c19e65898-config-out\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.611124 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611102 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7913899-f23d-44b1-b927-ea1c19e65898-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.611384 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611362 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-web-config\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.611573 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611557 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7913899-f23d-44b1-b927-ea1c19e65898-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.611728 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611696 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7913899-f23d-44b1-b927-ea1c19e65898-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.611834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611820 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjg5\" (UniqueName: \"kubernetes.io/projected/f7913899-f23d-44b1-b927-ea1c19e65898-kube-api-access-fhjg5\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.611956 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611939 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.612110 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.612092 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.612753 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.612430 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7913899-f23d-44b1-b927-ea1c19e65898-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.614066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.613680 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7913899-f23d-44b1-b927-ea1c19e65898-config-out\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.614066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.613845 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.614066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.611841 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f7913899-f23d-44b1-b927-ea1c19e65898-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.614066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.613991 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.614066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.613995 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.614866 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.614842 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.615203 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.615178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.615361 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:48:29.615339 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-web-config\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.616364 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.616344 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7913899-f23d-44b1-b927-ea1c19e65898-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.616455 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.616350 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7913899-f23d-44b1-b927-ea1c19e65898-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.629670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.629646 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhjg5\" (UniqueName: \"kubernetes.io/projected/f7913899-f23d-44b1-b927-ea1c19e65898-kube-api-access-fhjg5\") pod \"alertmanager-main-0\" (UID: \"f7913899-f23d-44b1-b927-ea1c19e65898\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.767193 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.767158 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:48:29.888330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:29.888303 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:48:29.890006 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:48:29.889976 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7913899_f23d_44b1_b927_ea1c19e65898.slice/crio-3f2d8585a3d61faf1fb92cbfb7e08f895a5d66cc6a0c77bfc1bf8cf1accea17e WatchSource:0}: Error finding container 3f2d8585a3d61faf1fb92cbfb7e08f895a5d66cc6a0c77bfc1bf8cf1accea17e: Status 404 returned error can't find the container with id 3f2d8585a3d61faf1fb92cbfb7e08f895a5d66cc6a0c77bfc1bf8cf1accea17e Apr 22 18:48:30.404669 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:30.404638 2562 generic.go:358] "Generic (PLEG): container finished" podID="f7913899-f23d-44b1-b927-ea1c19e65898" containerID="15e4de270095f9904a7753ed95be11005f22d95de7ba91bdf1efc6f85750b96c" exitCode=0 Apr 22 18:48:30.405051 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:30.404709 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerDied","Data":"15e4de270095f9904a7753ed95be11005f22d95de7ba91bdf1efc6f85750b96c"} Apr 22 18:48:30.405051 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:30.404729 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"3f2d8585a3d61faf1fb92cbfb7e08f895a5d66cc6a0c77bfc1bf8cf1accea17e"} Apr 22 18:48:30.843686 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:30.843657 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f3508d-ae22-4e4a-bae0-0df307c79003" 
path="/var/lib/kubelet/pods/d7f3508d-ae22-4e4a-bae0-0df307c79003/volumes" Apr 22 18:48:31.224671 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.224641 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"] Apr 22 18:48:31.228231 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.228211 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.230708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.230685 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:48:31.230853 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.230724 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:48:31.230853 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.230783 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:48:31.230853 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.230804 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:48:31.231136 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.231119 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-qzcdl\"" Apr 22 18:48:31.231252 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.231236 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:48:31.236459 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.236438 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:48:31.240317 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.240297 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"] Apr 22 18:48:31.325482 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325457 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325584 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325487 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-serving-certs-ca-bundle\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325584 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325509 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lx4c\" (UniqueName: \"kubernetes.io/projected/d922fb92-79c5-4a55-a2b5-321fe55b5381-kube-api-access-2lx4c\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325584 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325528 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-telemeter-client-tls\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325584 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325570 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-secret-telemeter-client\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325713 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325632 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-metrics-client-ca\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325713 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.325713 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.325700 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-federate-client-tls\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: 
\"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.410920 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.410893 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"0b095b9e12c4e54a01c2a1aed789ae5411cefdb21e59b26abaeb09f2114c19f8"} Apr 22 18:48:31.411219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.410927 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"3c79acae11bb2250638802edd39c1b992e5e4dc403cdbbe6fabb197b5d51c69a"} Apr 22 18:48:31.411219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.410943 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"63bb9b78225cd469d372dd43a890477c44e12d8e6ae7182422e5b0e0ed25e453"} Apr 22 18:48:31.411219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.410955 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"8ff6505388f2aa60d2f9fd172b52cb631e1476d855780f87d8bef67579716008"} Apr 22 18:48:31.411219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.410966 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"a6426a777ae17a8f3a4b16c58da6cdcbb7f9e01131519e91dcf1a0bd5416c4a0"} Apr 22 18:48:31.411219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.410978 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f7913899-f23d-44b1-b927-ea1c19e65898","Type":"ContainerStarted","Data":"c56a9c2b31dc19a569e9ccc1836397c0367dd9dc5e63996414eb96315eccec38"} Apr 22 18:48:31.426751 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-secret-telemeter-client\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.426828 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426768 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-metrics-client-ca\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.426828 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.426828 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426810 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-federate-client-tls\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" Apr 22 18:48:31.426963 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:48:31.426849 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.426963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426867 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-serving-certs-ca-bundle\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.426963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lx4c\" (UniqueName: \"kubernetes.io/projected/d922fb92-79c5-4a55-a2b5-321fe55b5381-kube-api-access-2lx4c\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.426963 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.426951 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-telemeter-client-tls\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.427815 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.427793 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-metrics-client-ca\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.427815 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.427802 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-serving-certs-ca-bundle\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.427945 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.427802 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d922fb92-79c5-4a55-a2b5-321fe55b5381-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.429811 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.429787 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.429895 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.429836 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-secret-telemeter-client\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.429988 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.429967 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-telemeter-client-tls\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.430045 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.429984 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d922fb92-79c5-4a55-a2b5-321fe55b5381-federate-client-tls\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.436845 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.436803 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.4367885449999998 podStartE2EDuration="2.436788545s" podCreationTimestamp="2026-04-22 18:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:31.434910308 +0000 UTC m=+123.114136599" watchObservedRunningTime="2026-04-22 18:48:31.436788545 +0000 UTC m=+123.116014839"
Apr 22 18:48:31.445037 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.443289 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lx4c\" (UniqueName: \"kubernetes.io/projected/d922fb92-79c5-4a55-a2b5-321fe55b5381-kube-api-access-2lx4c\") pod \"telemeter-client-76cf6dcdcf-2tgsb\" (UID: \"d922fb92-79c5-4a55-a2b5-321fe55b5381\") " pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.518321 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518257 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:48:31.518720 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518655 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="prometheus" containerID="cri-o://87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" gracePeriod=600
Apr 22 18:48:31.518816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518702 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="config-reloader" containerID="cri-o://6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" gracePeriod=600
Apr 22 18:48:31.518816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518715 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-web" containerID="cri-o://9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" gracePeriod=600
Apr 22 18:48:31.518816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518690 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy" containerID="cri-o://ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" gracePeriod=600
Apr 22 18:48:31.518816 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518748 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" gracePeriod=600
Apr 22 18:48:31.519008 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.518848 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="thanos-sidecar" containerID="cri-o://ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" gracePeriod=600
Apr 22 18:48:31.538706 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.538686 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"
Apr 22 18:48:31.680428 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.680400 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb"]
Apr 22 18:48:31.682480 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:48:31.682457 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd922fb92_79c5_4a55_a2b5_321fe55b5381.slice/crio-f59ff104b1c5a1f562d7b4275c25aaa0057429f5d6e9823c3221bc06bbbacd08 WatchSource:0}: Error finding container f59ff104b1c5a1f562d7b4275c25aaa0057429f5d6e9823c3221bc06bbbacd08: Status 404 returned error can't find the container with id f59ff104b1c5a1f562d7b4275c25aaa0057429f5d6e9823c3221bc06bbbacd08
Apr 22 18:48:31.765755 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.765735 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:48:31.830111 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830043 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-grpc-tls\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830111 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830075 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbqz\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-kube-api-access-5dbqz\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830113 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830136 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-thanos-prometheus-http-client-file\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830162 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-trusted-ca-bundle\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830184 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-web-config\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830207 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830240 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-rulefiles-0\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830283 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830264 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-metrics-client-certs\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830305 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config-out\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830355 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-kubelet-serving-ca-bundle\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830380 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-kube-rbac-proxy\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830420 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-tls\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830449 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830491 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-db\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830521 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-tls-assets\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830552 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-metrics-client-ca\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.830630 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830581 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-serving-certs-ca-bundle\") pod \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\" (UID: \"365d91ad-6d91-4995-b4f3-ff77c31fa9ab\") "
Apr 22 18:48:31.831086 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.830864 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:48:31.831146 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.831128 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:48:31.831325 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.831302 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:48:31.832276 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.832246 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:48:31.832882 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.832851 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.833083 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.832988 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.833499 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.833471 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:48:31.833879 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.833847 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:48:31.834080 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.833915 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-kube-api-access-5dbqz" (OuterVolumeSpecName: "kube-api-access-5dbqz") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "kube-api-access-5dbqz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:48:31.834080 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.833984 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config-out" (OuterVolumeSpecName: "config-out") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:48:31.834495 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.834433 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.834495 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.834462 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config" (OuterVolumeSpecName: "config") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.835078 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.835002 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.835289 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.835267 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.835467 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.835441 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:48:31.835801 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.835779 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.836505 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.836476 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.844830 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.844810 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-web-config" (OuterVolumeSpecName: "web-config") pod "365d91ad-6d91-4995-b4f3-ff77c31fa9ab" (UID: "365d91ad-6d91-4995-b4f3-ff77c31fa9ab"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:48:31.931602 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931582 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931602 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931602 2562 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-grpc-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931611 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dbqz\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-kube-api-access-5dbqz\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931621 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931630 2562 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931639 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931648 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-web-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931657 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931665 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931676 2562 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-metrics-client-certs\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931685 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config-out\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931694 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931703 2562 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-kube-rbac-proxy\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931713 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931721 2562 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.931724 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931730 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-prometheus-k8s-db\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.932138 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931740 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-tls-assets\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:31.932138 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:31.931748 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/365d91ad-6d91-4995-b4f3-ff77c31fa9ab-configmap-metrics-client-ca\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:48:32.415083 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.415048 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" event={"ID":"d922fb92-79c5-4a55-a2b5-321fe55b5381","Type":"ContainerStarted","Data":"f59ff104b1c5a1f562d7b4275c25aaa0057429f5d6e9823c3221bc06bbbacd08"}
Apr 22 18:48:32.418426 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418395 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" exitCode=0
Apr 22 18:48:32.418426 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418427 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" exitCode=0
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418438 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" exitCode=0
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418447 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" exitCode=0
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418459 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" exitCode=0
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418467 2562 generic.go:358] "Generic (PLEG): container finished" podID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" exitCode=0
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418486 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418527 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418533 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418541 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418556 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418568 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418580 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418594 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"365d91ad-6d91-4995-b4f3-ff77c31fa9ab","Type":"ContainerDied","Data":"380439a9b93ba912d422890ef08c344f70f4e39dc663fdc847a51fbd9e469ac5"}
Apr 22 18:48:32.418616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.418597 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"
Apr 22 18:48:32.426886 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.426864 2562 scope.go:117] "RemoveContainer" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"
Apr 22 18:48:32.434847 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.434824 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"
Apr 22 18:48:32.444185 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.444123 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"
Apr 22 18:48:32.446793 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.446770 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:48:32.451324 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.451280 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:48:32.453141 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.453085 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"
Apr 22 18:48:32.460129 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.460102 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"
Apr 22 18:48:32.468007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.467988 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"
Apr 22 18:48:32.475247 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475228 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 18:48:32.475676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475651 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="prometheus"
Apr 22 18:48:32.475676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475669 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="prometheus"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475686 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="init-config-reloader"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475696 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="init-config-reloader"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475704 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="config-reloader"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475712 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="config-reloader"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475726 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-web"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475734 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-web"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475747 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475755 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy"
Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475771 2562
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-thanos" Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475779 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-thanos" Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475792 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="thanos-sidecar" Apr 22 18:48:32.475812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475800 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="thanos-sidecar" Apr 22 18:48:32.476410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475871 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="config-reloader" Apr 22 18:48:32.476410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475885 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-thanos" Apr 22 18:48:32.476410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475896 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="prometheus" Apr 22 18:48:32.476410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475906 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="thanos-sidecar" Apr 22 18:48:32.476410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.475918 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy-web" Apr 22 18:48:32.476410 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:48:32.475930 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" containerName="kube-rbac-proxy" Apr 22 18:48:32.476739 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.476424 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.476739 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.476697 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.476830 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.476731 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"} err="failed to get container status \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" Apr 22 18:48:32.476830 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.476768 2562 scope.go:117] "RemoveContainer" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.477051 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.477007 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with 
ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.477154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.477059 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"} err="failed to get container status \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" Apr 22 18:48:32.477154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.477080 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.477462 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.477413 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.477556 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.477469 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"} err="failed to get container status \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 
9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" Apr 22 18:48:32.477556 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.477492 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.477880 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.477856 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.477972 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.477887 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"} err="failed to get container status \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" Apr 22 18:48:32.477972 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.477909 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.478280 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.478261 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" 
containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.478373 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.478285 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"} err="failed to get container status \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" Apr 22 18:48:32.478373 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.478303 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.478561 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.478539 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.478611 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.478571 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"} err="failed to get container status \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": rpc error: code = NotFound desc = could not find container \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" Apr 22 
18:48:32.478611 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.478592 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.478860 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:48:32.478839 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.478937 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.478865 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"} err="failed to get container status \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" Apr 22 18:48:32.478937 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.478883 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.479124 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479105 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"} err="failed to get container status \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting 
with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" Apr 22 18:48:32.479192 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479127 2562 scope.go:117] "RemoveContainer" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.479379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479359 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"} err="failed to get container status \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" Apr 22 18:48:32.479440 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479382 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.479598 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479582 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"} err="failed to get container status \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" Apr 22 18:48:32.479598 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479598 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.479838 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479805 
2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"} err="failed to get container status \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" Apr 22 18:48:32.479838 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.479836 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.480056 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480037 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"} err="failed to get container status \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" Apr 22 18:48:32.480120 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480058 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.480324 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480305 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"} err="failed to get container status \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": rpc error: code = NotFound desc = could not find container 
\"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" Apr 22 18:48:32.480371 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480324 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.480585 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480554 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"} err="failed to get container status \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" Apr 22 18:48:32.480585 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480578 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.480784 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480760 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"} err="failed to get container status \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" Apr 22 18:48:32.480838 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.480785 2562 scope.go:117] "RemoveContainer" 
containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.481119 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481093 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"} err="failed to get container status \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" Apr 22 18:48:32.481119 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481118 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.481372 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481351 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"} err="failed to get container status \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" Apr 22 18:48:32.481372 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481373 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.481497 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481481 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.481584 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481565 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"} err="failed to get container status \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" Apr 22 18:48:32.481644 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481586 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.481998 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481842 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"} err="failed to get container status \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" Apr 22 18:48:32.481998 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.481872 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.482197 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.482175 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"} err="failed to get container status 
\"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": rpc error: code = NotFound desc = could not find container \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" Apr 22 18:48:32.482261 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.482200 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.482444 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.482420 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"} err="failed to get container status \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" Apr 22 18:48:32.482534 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.482445 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.482717 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.482691 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"} err="failed to get container status \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" Apr 22 18:48:32.482792 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:48:32.482719 2562 scope.go:117] "RemoveContainer" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.483028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.482985 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"} err="failed to get container status \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" Apr 22 18:48:32.483028 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483026 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.483316 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483284 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"} err="failed to get container status \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" Apr 22 18:48:32.483316 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483305 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.483569 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483539 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"} err="failed to get container status \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" Apr 22 18:48:32.483569 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483564 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.483834 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483810 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"} err="failed to get container status \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" Apr 22 18:48:32.483928 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483836 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.483928 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.483903 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:48:32.484122 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484060 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"} err="failed to get container status 
\"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": rpc error: code = NotFound desc = could not find container \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" Apr 22 18:48:32.484122 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484085 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.484632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484251 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:48:32.484632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484354 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cqprc2pk0hk0r\"" Apr 22 18:48:32.484632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484356 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"} err="failed to get container status \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" Apr 22 18:48:32.484632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484383 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.484632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484417 2562 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:48:32.484632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484503 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-t6bwc\"" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484705 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484731 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"} err="failed to get container status \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484755 2562 scope.go:117] "RemoveContainer" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484758 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.484837 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485221 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:48:32.485535 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485221 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485406 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"} err="failed to get container status \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" Apr 22 18:48:32.485535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485428 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.486840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485587 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:48:32.486840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485634 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:48:32.486840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485944 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"} err="failed to get container status \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 
9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" Apr 22 18:48:32.486840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.485983 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.486840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.486746 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"} err="failed to get container status \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" Apr 22 18:48:32.486840 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.486769 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.487193 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487077 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"} err="failed to get container status \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" Apr 22 18:48:32.487193 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487099 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.487441 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487391 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"} err="failed to get container status \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": rpc error: code = NotFound desc = could not find container \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" Apr 22 18:48:32.487501 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487444 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.487614 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487595 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:48:32.487817 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487792 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"} err="failed to get container status \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" Apr 22 18:48:32.487887 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.487817 2562 scope.go:117] "RemoveContainer" containerID="2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5" Apr 22 18:48:32.488236 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.488205 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5"} err="failed to get container status \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": rpc error: code = NotFound desc = could not find container \"2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5\": container with ID starting with 2fe07d3026d67e58b90ef6a5727b34747aa657b905b60f6fceea93760fb19ed5 not found: ID does not exist" Apr 22 18:48:32.488236 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.488235 2562 scope.go:117] "RemoveContainer" containerID="ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105" Apr 22 18:48:32.488540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.488509 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105"} err="failed to get container status \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": rpc error: code = NotFound desc = could not find container \"ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105\": container with ID starting with ec0f987ee5581d637e746eadca1dbae117d2fa57de97d34ee54b10ad09ab4105 not found: ID does not exist" Apr 22 18:48:32.488540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.488535 2562 scope.go:117] "RemoveContainer" containerID="9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103" Apr 22 18:48:32.488827 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.488789 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103"} err="failed to get container status \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": rpc error: code = NotFound desc = could not find container \"9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103\": container with ID starting with 
9c1868bfc17e33c801aedce6e3aeab74910489a1ff4e4091a971844cce47b103 not found: ID does not exist" Apr 22 18:48:32.488827 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.488817 2562 scope.go:117] "RemoveContainer" containerID="ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a" Apr 22 18:48:32.489348 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489116 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a"} err="failed to get container status \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": rpc error: code = NotFound desc = could not find container \"ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a\": container with ID starting with ce25be86b047c0baf4a8d6ebd5b8a948690c49a0207cef14e646b6d22e618b6a not found: ID does not exist" Apr 22 18:48:32.489348 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489155 2562 scope.go:117] "RemoveContainer" containerID="6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104" Apr 22 18:48:32.489629 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489396 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104"} err="failed to get container status \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": rpc error: code = NotFound desc = could not find container \"6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104\": container with ID starting with 6d6669a459773ec514ceb19e7641becce328571458ba58e7a7633945a3614104 not found: ID does not exist" Apr 22 18:48:32.489629 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489420 2562 scope.go:117] "RemoveContainer" containerID="87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933" Apr 22 18:48:32.489794 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489664 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933"} err="failed to get container status \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": rpc error: code = NotFound desc = could not find container \"87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933\": container with ID starting with 87cdf0894e18bb835f2e29c3a2ff9b254c8819f90ebe177c75e6aa693d534933 not found: ID does not exist" Apr 22 18:48:32.489794 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489689 2562 scope.go:117] "RemoveContainer" containerID="6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5" Apr 22 18:48:32.490063 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.489998 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5"} err="failed to get container status \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": rpc error: code = NotFound desc = could not find container \"6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5\": container with ID starting with 6a67f49040d975debfaa0ea356ee99e83b82aa248e3a608f99f6f384ce62f3a5 not found: ID does not exist" Apr 22 18:48:32.490760 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.490731 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:48:32.493792 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.493759 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:48:32.536352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536293 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536352 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536322 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536504 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536378 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536504 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536398 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536504 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536427 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f873d8da-279d-4a65-bdef-a40c7f6c6f93-config-out\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536659 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:48:32.536574 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-config\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536659 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536630 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536748 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536682 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f873d8da-279d-4a65-bdef-a40c7f6c6f93-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536793 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536748 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536793 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536765 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.536793 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536779 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536803 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536884 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536915 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qwpj\" (UniqueName: \"kubernetes.io/projected/f873d8da-279d-4a65-bdef-a40c7f6c6f93-kube-api-access-4qwpj\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536941 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-web-config\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.536965 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537198 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.537008 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.537198 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.537058 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637607 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637579 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-config\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637788 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637612 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637788 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f873d8da-279d-4a65-bdef-a40c7f6c6f93-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637788 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637750 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637788 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637775 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637794 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637811 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637839 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637865 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qwpj\" (UniqueName: \"kubernetes.io/projected/f873d8da-279d-4a65-bdef-a40c7f6c6f93-kube-api-access-4qwpj\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637893 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-web-config\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.637964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637927 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.637954 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638008 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638096 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638126 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638204 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638209 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638231 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.638287 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.638284 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f873d8da-279d-4a65-bdef-a40c7f6c6f93-config-out\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.641223 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.640710 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f873d8da-279d-4a65-bdef-a40c7f6c6f93-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.641223 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.640890 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.641223 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.640890 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.641223 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.641077 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f873d8da-279d-4a65-bdef-a40c7f6c6f93-config-out\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.641485 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.641278 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-config\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.641535 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.641496 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.646009 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:48:32.645101 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.647128 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.647104 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.647128 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.647121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.647877 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.647826 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.649612 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.648907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f873d8da-279d-4a65-bdef-a40c7f6c6f93-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.649612 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.649454 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.649612 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.649504 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.650298 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.650275 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-web-config\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.650660 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.650638 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.650754 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.650742 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f873d8da-279d-4a65-bdef-a40c7f6c6f93-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.656220 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.656202 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qwpj\" (UniqueName: \"kubernetes.io/projected/f873d8da-279d-4a65-bdef-a40c7f6c6f93-kube-api-access-4qwpj\") pod \"prometheus-k8s-0\" (UID: \"f873d8da-279d-4a65-bdef-a40c7f6c6f93\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.795338 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.795269 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:32.842622 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.842589 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="365d91ad-6d91-4995-b4f3-ff77c31fa9ab" path="/var/lib/kubelet/pods/365d91ad-6d91-4995-b4f3-ff77c31fa9ab/volumes" Apr 22 18:48:32.925408 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:32.925286 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:48:32.927703 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:48:32.927672 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf873d8da_279d_4a65_bdef_a40c7f6c6f93.slice/crio-e1b1bc8d6014f177e44b16d591df0370a723180c4f678a513aa2658f22cada2c WatchSource:0}: Error finding container e1b1bc8d6014f177e44b16d591df0370a723180c4f678a513aa2658f22cada2c: Status 404 returned error can't find the container with id e1b1bc8d6014f177e44b16d591df0370a723180c4f678a513aa2658f22cada2c Apr 22 18:48:33.423062 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:33.423027 2562 generic.go:358] "Generic (PLEG): container finished" podID="f873d8da-279d-4a65-bdef-a40c7f6c6f93" containerID="29a3044d671431f45c5323a4e33a15ba67bfb32180440ee9d1b0df24252e73cd" exitCode=0 Apr 22 18:48:33.423481 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:33.423115 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerDied","Data":"29a3044d671431f45c5323a4e33a15ba67bfb32180440ee9d1b0df24252e73cd"} Apr 22 18:48:33.423481 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:33.423160 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"e1b1bc8d6014f177e44b16d591df0370a723180c4f678a513aa2658f22cada2c"} Apr 22 18:48:34.429772 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.429682 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"0bf31534c37e607c7e916565e4be8d5efcbf7044b02e6e6357f4e7b48e0e367d"} Apr 22 18:48:34.429772 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.429722 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"607a0cb8accd2ed77f2b818808d867f4f32269d298cccdccca1f52c76bd4a486"} Apr 22 18:48:34.429772 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.429737 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"318cdf0f9be6c803aa18eb444c59858c36da0f69ac219f3ed319764f8e7803e7"} Apr 22 18:48:34.429772 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.429748 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"3e82a76bb2a274f01421b1ff69e3589b32d462d326f8993af6fb588fb18083ec"} Apr 22 18:48:34.429772 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.429757 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"254c9c1c8e2da8d43c2bad10662abcec81bb31c263b0257a2093139055368f41"} Apr 22 18:48:34.429772 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.429765 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f873d8da-279d-4a65-bdef-a40c7f6c6f93","Type":"ContainerStarted","Data":"d559aed5cda4622a978a94d60b7cceddc06d0bca9fc0c3da4fd0c052ef1978aa"} Apr 22 18:48:34.431424 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.431403 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" event={"ID":"d922fb92-79c5-4a55-a2b5-321fe55b5381","Type":"ContainerStarted","Data":"4a8156c73393d876c90d45fdce5ca57424d92d49621525eba8943582f4690d2a"} Apr 22 18:48:34.431499 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.431430 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" event={"ID":"d922fb92-79c5-4a55-a2b5-321fe55b5381","Type":"ContainerStarted","Data":"2ab318de4a5a30e89279c30620acc6a6b219e4b4ebfba33b3f3dde818785f3a9"} Apr 22 18:48:34.431499 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.431439 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" event={"ID":"d922fb92-79c5-4a55-a2b5-321fe55b5381","Type":"ContainerStarted","Data":"67524847c0d1085d7632f7cf903e7cd6d14963421096448485b44c1bec42b766"} Apr 22 18:48:34.457051 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.456986 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.456969351 podStartE2EDuration="2.456969351s" podCreationTimestamp="2026-04-22 
18:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:34.455056511 +0000 UTC m=+126.134282802" watchObservedRunningTime="2026-04-22 18:48:34.456969351 +0000 UTC m=+126.136195646" Apr 22 18:48:34.478326 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:34.478283 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-76cf6dcdcf-2tgsb" podStartSLOduration=1.320857698 podStartE2EDuration="3.478271105s" podCreationTimestamp="2026-04-22 18:48:31 +0000 UTC" firstStartedPulling="2026-04-22 18:48:31.684382392 +0000 UTC m=+123.363608662" lastFinishedPulling="2026-04-22 18:48:33.841795782 +0000 UTC m=+125.521022069" observedRunningTime="2026-04-22 18:48:34.477060477 +0000 UTC m=+126.156286770" watchObservedRunningTime="2026-04-22 18:48:34.478271105 +0000 UTC m=+126.157497397" Apr 22 18:48:37.795632 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:48:37.795596 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:32.795658 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:49:32.795625 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:32.810873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:49:32.810851 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:33.615751 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:49:33.615724 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:22.208957 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.208924 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-r5vf6"] Apr 22 18:50:22.212044 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:50:22.212008 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.214540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.214513 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:50:22.220396 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.220372 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r5vf6"] Apr 22 18:50:22.363743 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.363708 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-original-pull-secret\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.363935 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.363761 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-dbus\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.363935 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.363863 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-kubelet-config\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.464828 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.464737 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-dbus\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.464958 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.464839 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-kubelet-config\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.464958 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.464878 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-original-pull-secret\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.464958 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.464931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-dbus\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.464958 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.464951 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-kubelet-config\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.467075 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.467060 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f2bae4e9-5ac7-46d8-beee-9c9a3a81031b-original-pull-secret\") pod \"global-pull-secret-syncer-r5vf6\" (UID: \"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b\") " pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.521646 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.521618 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r5vf6" Apr 22 18:50:22.635071 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.634945 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r5vf6"] Apr 22 18:50:22.637339 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:50:22.637307 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2bae4e9_5ac7_46d8_beee_9c9a3a81031b.slice/crio-230cdd95869b1d355b18e187ba8a8694e181f97ca37524d487d3fcc6f138d7f3 WatchSource:0}: Error finding container 230cdd95869b1d355b18e187ba8a8694e181f97ca37524d487d3fcc6f138d7f3: Status 404 returned error can't find the container with id 230cdd95869b1d355b18e187ba8a8694e181f97ca37524d487d3fcc6f138d7f3 Apr 22 18:50:22.735079 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:22.734983 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r5vf6" event={"ID":"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b","Type":"ContainerStarted","Data":"230cdd95869b1d355b18e187ba8a8694e181f97ca37524d487d3fcc6f138d7f3"} Apr 22 18:50:26.747885 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:50:26.747851 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r5vf6" event={"ID":"f2bae4e9-5ac7-46d8-beee-9c9a3a81031b","Type":"ContainerStarted","Data":"ee570a815146d4633700a4c09f53c00ab78111dcea0bf0a89f8389cb6b36c8e9"} Apr 22 18:50:26.762752 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:50:26.762698 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-r5vf6" podStartSLOduration=0.95973753 podStartE2EDuration="4.762685194s" podCreationTimestamp="2026-04-22 18:50:22 +0000 UTC" firstStartedPulling="2026-04-22 18:50:22.638828808 +0000 UTC m=+234.318055079" lastFinishedPulling="2026-04-22 18:50:26.441776459 +0000 UTC m=+238.121002743" observedRunningTime="2026-04-22 18:50:26.762419106 +0000 UTC m=+238.441645402" watchObservedRunningTime="2026-04-22 18:50:26.762685194 +0000 UTC m=+238.441911486" Apr 22 18:51:28.763618 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:51:28.763594 2562 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:52:14.286248 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.286218 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-cqwz5"] Apr 22 18:52:14.289207 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.289190 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.291831 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.291805 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-qkpkq\"" Apr 22 18:52:14.291965 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.291835 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:52:14.291965 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.291810 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:52:14.292862 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.292845 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:52:14.298562 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.298541 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-cqwz5"] Apr 22 18:52:14.431373 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.431337 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a4b4aca4-a42d-4da5-bf19-0e0dcca4d385-data\") pod \"seaweedfs-86cc847c5c-cqwz5\" (UID: \"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385\") " pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.431545 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.431382 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8pp7\" (UniqueName: \"kubernetes.io/projected/a4b4aca4-a42d-4da5-bf19-0e0dcca4d385-kube-api-access-q8pp7\") pod \"seaweedfs-86cc847c5c-cqwz5\" (UID: \"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385\") " pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.532249 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.532217 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a4b4aca4-a42d-4da5-bf19-0e0dcca4d385-data\") pod \"seaweedfs-86cc847c5c-cqwz5\" (UID: \"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385\") " pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.532396 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.532271 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8pp7\" (UniqueName: \"kubernetes.io/projected/a4b4aca4-a42d-4da5-bf19-0e0dcca4d385-kube-api-access-q8pp7\") pod \"seaweedfs-86cc847c5c-cqwz5\" (UID: \"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385\") " pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.532570 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.532552 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/a4b4aca4-a42d-4da5-bf19-0e0dcca4d385-data\") pod \"seaweedfs-86cc847c5c-cqwz5\" (UID: \"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385\") " pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.541783 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.541721 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8pp7\" (UniqueName: \"kubernetes.io/projected/a4b4aca4-a42d-4da5-bf19-0e0dcca4d385-kube-api-access-q8pp7\") pod \"seaweedfs-86cc847c5c-cqwz5\" (UID: \"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385\") " pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.599123 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.599101 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:14.713445 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.713417 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-cqwz5"] Apr 22 18:52:14.717108 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:52:14.717078 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4b4aca4_a42d_4da5_bf19_0e0dcca4d385.slice/crio-f21b5928110e16a9eb847c53b85840d40ba13cfa80b3e883177c77b4dc464f3d WatchSource:0}: Error finding container f21b5928110e16a9eb847c53b85840d40ba13cfa80b3e883177c77b4dc464f3d: Status 404 returned error can't find the container with id f21b5928110e16a9eb847c53b85840d40ba13cfa80b3e883177c77b4dc464f3d Apr 22 18:52:14.718232 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:14.718217 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:52:15.054459 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:15.054417 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-cqwz5" event={"ID":"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385","Type":"ContainerStarted","Data":"f21b5928110e16a9eb847c53b85840d40ba13cfa80b3e883177c77b4dc464f3d"} Apr 22 18:52:18.066902 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:18.066872 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-cqwz5" event={"ID":"a4b4aca4-a42d-4da5-bf19-0e0dcca4d385","Type":"ContainerStarted","Data":"65012773247e1e1078f3a1d2de8dfef6c847f39199d7883c192cba2ba1f6c6f9"} Apr 22 18:52:18.067318 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:18.067132 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:52:18.083357 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:18.083318 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/seaweedfs-86cc847c5c-cqwz5" podStartSLOduration=1.265329199 podStartE2EDuration="4.083305861s" podCreationTimestamp="2026-04-22 18:52:14 +0000 UTC" firstStartedPulling="2026-04-22 18:52:14.718344482 +0000 UTC m=+346.397570752" lastFinishedPulling="2026-04-22 18:52:17.536321144 +0000 UTC m=+349.215547414" observedRunningTime="2026-04-22 18:52:18.081548735 +0000 UTC m=+349.760775037" watchObservedRunningTime="2026-04-22 18:52:18.083305861 +0000 UTC m=+349.762532153" Apr 22 18:52:24.074783 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:52:24.074747 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-cqwz5" Apr 22 18:53:24.345130 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.345102 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-swv5t"] Apr 22 18:53:24.347989 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.347974 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:24.350547 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.350529 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:53:24.350650 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.350600 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-4kj8k\"" Apr 22 18:53:24.357971 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.357949 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-swv5t"] Apr 22 18:53:24.468005 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.467976 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxv4j\" (UniqueName: \"kubernetes.io/projected/454f1685-874d-406c-8bfc-5d7de31631b7-kube-api-access-jxv4j\") pod 
\"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:24.468140 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.468037 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:24.569324 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.569295 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxv4j\" (UniqueName: \"kubernetes.io/projected/454f1685-874d-406c-8bfc-5d7de31631b7-kube-api-access-jxv4j\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:24.569442 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.569335 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:24.569495 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:53:24.569444 2562 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:53:24.569531 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:53:24.569506 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert podName:454f1685-874d-406c-8bfc-5d7de31631b7 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:53:25.06948903 +0000 UTC m=+416.748715300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert") pod "odh-model-controller-696fc77849-swv5t" (UID: "454f1685-874d-406c-8bfc-5d7de31631b7") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:53:24.579931 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:24.579903 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxv4j\" (UniqueName: \"kubernetes.io/projected/454f1685-874d-406c-8bfc-5d7de31631b7-kube-api-access-jxv4j\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:25.074007 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:25.073963 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:25.074190 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:53:25.074108 2562 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:53:25.074190 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:53:25.074183 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert podName:454f1685-874d-406c-8bfc-5d7de31631b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:53:26.074163076 +0000 UTC m=+417.753389349 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert") pod "odh-model-controller-696fc77849-swv5t" (UID: "454f1685-874d-406c-8bfc-5d7de31631b7") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:53:26.081337 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:26.081301 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:26.083528 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:26.083509 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/454f1685-874d-406c-8bfc-5d7de31631b7-cert\") pod \"odh-model-controller-696fc77849-swv5t\" (UID: \"454f1685-874d-406c-8bfc-5d7de31631b7\") " pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:26.158154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:26.158128 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:26.275663 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:26.275639 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-swv5t"] Apr 22 18:53:26.278055 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:53:26.278025 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod454f1685_874d_406c_8bfc_5d7de31631b7.slice/crio-aeeab607e0a4de7f3125ab25f2fb3765c433cffb445ed9dbf992701d578a5bc6 WatchSource:0}: Error finding container aeeab607e0a4de7f3125ab25f2fb3765c433cffb445ed9dbf992701d578a5bc6: Status 404 returned error can't find the container with id aeeab607e0a4de7f3125ab25f2fb3765c433cffb445ed9dbf992701d578a5bc6 Apr 22 18:53:27.268865 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:27.268826 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-swv5t" event={"ID":"454f1685-874d-406c-8bfc-5d7de31631b7","Type":"ContainerStarted","Data":"aeeab607e0a4de7f3125ab25f2fb3765c433cffb445ed9dbf992701d578a5bc6"} Apr 22 18:53:29.275846 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:29.275753 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-swv5t" event={"ID":"454f1685-874d-406c-8bfc-5d7de31631b7","Type":"ContainerStarted","Data":"b35e3ca8c2a8615dbece904f32e4d6d9ba6d4fd1414a0606712ef5fbdb8e9fab"} Apr 22 18:53:29.276219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:29.275886 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:29.292246 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:29.292202 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-swv5t" podStartSLOduration=2.613755381 podStartE2EDuration="5.292188228s" 
podCreationTimestamp="2026-04-22 18:53:24 +0000 UTC" firstStartedPulling="2026-04-22 18:53:26.279141135 +0000 UTC m=+417.958367409" lastFinishedPulling="2026-04-22 18:53:28.957573986 +0000 UTC m=+420.636800256" observedRunningTime="2026-04-22 18:53:29.291147883 +0000 UTC m=+420.970374175" watchObservedRunningTime="2026-04-22 18:53:29.292188228 +0000 UTC m=+420.971414519" Apr 22 18:53:40.281437 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:40.281365 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-swv5t" Apr 22 18:53:41.075525 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.075491 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-lwt56"] Apr 22 18:53:41.078572 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.078556 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lwt56" Apr 22 18:53:41.085511 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.085479 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-lwt56"] Apr 22 18:53:41.102795 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.102773 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95m52\" (UniqueName: \"kubernetes.io/projected/a510694b-77d7-467b-99d3-07b8d4fd4d86-kube-api-access-95m52\") pod \"s3-init-lwt56\" (UID: \"a510694b-77d7-467b-99d3-07b8d4fd4d86\") " pod="kserve/s3-init-lwt56" Apr 22 18:53:41.203470 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.203443 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95m52\" (UniqueName: \"kubernetes.io/projected/a510694b-77d7-467b-99d3-07b8d4fd4d86-kube-api-access-95m52\") pod \"s3-init-lwt56\" (UID: \"a510694b-77d7-467b-99d3-07b8d4fd4d86\") " pod="kserve/s3-init-lwt56" Apr 22 18:53:41.212572 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.212542 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95m52\" (UniqueName: \"kubernetes.io/projected/a510694b-77d7-467b-99d3-07b8d4fd4d86-kube-api-access-95m52\") pod \"s3-init-lwt56\" (UID: \"a510694b-77d7-467b-99d3-07b8d4fd4d86\") " pod="kserve/s3-init-lwt56" Apr 22 18:53:41.403677 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.403588 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-lwt56" Apr 22 18:53:41.517586 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:41.517559 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-lwt56"] Apr 22 18:53:41.520486 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:53:41.520452 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda510694b_77d7_467b_99d3_07b8d4fd4d86.slice/crio-1d7951d1223e0444cf633e9d337072de903900df11846da840c0fec04b5b9806 WatchSource:0}: Error finding container 1d7951d1223e0444cf633e9d337072de903900df11846da840c0fec04b5b9806: Status 404 returned error can't find the container with id 1d7951d1223e0444cf633e9d337072de903900df11846da840c0fec04b5b9806 Apr 22 18:53:42.319399 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:42.319357 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lwt56" event={"ID":"a510694b-77d7-467b-99d3-07b8d4fd4d86","Type":"ContainerStarted","Data":"1d7951d1223e0444cf633e9d337072de903900df11846da840c0fec04b5b9806"} Apr 22 18:53:46.335121 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:46.335089 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lwt56" event={"ID":"a510694b-77d7-467b-99d3-07b8d4fd4d86","Type":"ContainerStarted","Data":"3194d2904678b08d45754c236e9a47988b153df2659487ab7b8cea0187f7a542"} Apr 22 18:53:46.349708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:46.349659 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/s3-init-lwt56" podStartSLOduration=0.987127416 podStartE2EDuration="5.349644609s" podCreationTimestamp="2026-04-22 18:53:41 +0000 UTC" firstStartedPulling="2026-04-22 18:53:41.522361997 +0000 UTC m=+433.201588267" lastFinishedPulling="2026-04-22 18:53:45.884879191 +0000 UTC m=+437.564105460" observedRunningTime="2026-04-22 18:53:46.348944537 +0000 UTC m=+438.028170842" watchObservedRunningTime="2026-04-22 18:53:46.349644609 +0000 UTC m=+438.028870901" Apr 22 18:53:49.345607 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:49.345567 2562 generic.go:358] "Generic (PLEG): container finished" podID="a510694b-77d7-467b-99d3-07b8d4fd4d86" containerID="3194d2904678b08d45754c236e9a47988b153df2659487ab7b8cea0187f7a542" exitCode=0 Apr 22 18:53:49.345979 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:49.345622 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lwt56" event={"ID":"a510694b-77d7-467b-99d3-07b8d4fd4d86","Type":"ContainerDied","Data":"3194d2904678b08d45754c236e9a47988b153df2659487ab7b8cea0187f7a542"} Apr 22 18:53:50.467855 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:50.467831 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-lwt56" Apr 22 18:53:50.596776 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:50.596746 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95m52\" (UniqueName: \"kubernetes.io/projected/a510694b-77d7-467b-99d3-07b8d4fd4d86-kube-api-access-95m52\") pod \"a510694b-77d7-467b-99d3-07b8d4fd4d86\" (UID: \"a510694b-77d7-467b-99d3-07b8d4fd4d86\") " Apr 22 18:53:50.598894 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:50.598867 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a510694b-77d7-467b-99d3-07b8d4fd4d86-kube-api-access-95m52" (OuterVolumeSpecName: "kube-api-access-95m52") pod "a510694b-77d7-467b-99d3-07b8d4fd4d86" (UID: "a510694b-77d7-467b-99d3-07b8d4fd4d86"). InnerVolumeSpecName "kube-api-access-95m52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:50.697454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:50.697432 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-95m52\" (UniqueName: \"kubernetes.io/projected/a510694b-77d7-467b-99d3-07b8d4fd4d86-kube-api-access-95m52\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:53:51.352442 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:51.352410 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-lwt56" Apr 22 18:53:51.352442 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:51.352420 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-lwt56" event={"ID":"a510694b-77d7-467b-99d3-07b8d4fd4d86","Type":"ContainerDied","Data":"1d7951d1223e0444cf633e9d337072de903900df11846da840c0fec04b5b9806"} Apr 22 18:53:51.352442 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:53:51.352448 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7951d1223e0444cf633e9d337072de903900df11846da840c0fec04b5b9806" Apr 22 18:54:00.365132 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.365096 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"] Apr 22 18:54:00.365506 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.365434 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a510694b-77d7-467b-99d3-07b8d4fd4d86" containerName="s3-init" Apr 22 18:54:00.365506 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.365445 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a510694b-77d7-467b-99d3-07b8d4fd4d86" containerName="s3-init" Apr 22 18:54:00.365573 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.365521 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="a510694b-77d7-467b-99d3-07b8d4fd4d86" containerName="s3-init" Apr 22 18:54:00.368909 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.368886 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.371334 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.371307 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\"" Apr 22 18:54:00.371454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.371337 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-raw-sklearn-batcher-5c98d-predictor-serving-cert\"" Apr 22 18:54:00.371454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.371350 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:54:00.371454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.371361 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j46rb\"" Apr 22 18:54:00.371454 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.371390 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:54:00.379526 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.379499 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"] Apr 22 18:54:00.472249 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.472204 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlhz\" (UniqueName: \"kubernetes.io/projected/9708e338-c6ac-4415-8a67-1c77d5f09062-kube-api-access-9tlhz\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.472447 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.472257 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9708e338-c6ac-4415-8a67-1c77d5f09062-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.472447 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.472391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.472447 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.472430 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9708e338-c6ac-4415-8a67-1c77d5f09062-isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.573382 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.573335 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.573588 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.573387 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9708e338-c6ac-4415-8a67-1c77d5f09062-isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.573588 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.573446 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlhz\" (UniqueName: \"kubernetes.io/projected/9708e338-c6ac-4415-8a67-1c77d5f09062-kube-api-access-9tlhz\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.573588 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.573474 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9708e338-c6ac-4415-8a67-1c77d5f09062-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.573588 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:54:00.573545 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-serving-cert: secret "isvc-raw-sklearn-batcher-5c98d-predictor-serving-cert" not found Apr 22 18:54:00.573836 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:54:00.573629 2562 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls podName:9708e338-c6ac-4415-8a67-1c77d5f09062 nodeName:}" failed. No retries permitted until 2026-04-22 18:54:01.073606861 +0000 UTC m=+452.752833153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls") pod "isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" (UID: "9708e338-c6ac-4415-8a67-1c77d5f09062") : secret "isvc-raw-sklearn-batcher-5c98d-predictor-serving-cert" not found Apr 22 18:54:00.573931 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.573909 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9708e338-c6ac-4415-8a67-1c77d5f09062-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.574140 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.574121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9708e338-c6ac-4415-8a67-1c77d5f09062-isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:00.585008 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:00.584969 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tlhz\" (UniqueName: \"kubernetes.io/projected/9708e338-c6ac-4415-8a67-1c77d5f09062-kube-api-access-9tlhz\") pod 
\"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:01.078275 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:01.078244 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:01.080647 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:01.080625 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls\") pod \"isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:01.279777 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:01.279747 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:01.401835 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:01.401779 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"] Apr 22 18:54:01.404233 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:54:01.404206 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9708e338_c6ac_4415_8a67_1c77d5f09062.slice/crio-bf3b3dd77a2655da682f2865db88565a1b124745491e1c6a836ad81f8e7b4233 WatchSource:0}: Error finding container bf3b3dd77a2655da682f2865db88565a1b124745491e1c6a836ad81f8e7b4233: Status 404 returned error can't find the container with id bf3b3dd77a2655da682f2865db88565a1b124745491e1c6a836ad81f8e7b4233 Apr 22 18:54:02.391383 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:02.391333 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerStarted","Data":"bf3b3dd77a2655da682f2865db88565a1b124745491e1c6a836ad81f8e7b4233"} Apr 22 18:54:06.405251 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:06.405216 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerStarted","Data":"43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee"} Apr 22 18:54:09.414628 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:09.414600 2562 generic.go:358] "Generic (PLEG): container finished" podID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerID="43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee" exitCode=0 Apr 22 18:54:09.414915 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:09.414677 2562 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerDied","Data":"43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee"} Apr 22 18:54:23.466599 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:23.466563 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerStarted","Data":"b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f"} Apr 22 18:54:26.478348 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:26.478308 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerStarted","Data":"b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6"} Apr 22 18:54:29.489922 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:29.489888 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerStarted","Data":"d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88"} Apr 22 18:54:29.490358 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:29.490236 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:29.490358 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:29.490254 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:29.491348 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:29.491302 2562 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:54:29.515503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:29.515456 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podStartSLOduration=2.410697674 podStartE2EDuration="29.515446307s" podCreationTimestamp="2026-04-22 18:54:00 +0000 UTC" firstStartedPulling="2026-04-22 18:54:01.40593743 +0000 UTC m=+453.085163700" lastFinishedPulling="2026-04-22 18:54:28.510686062 +0000 UTC m=+480.189912333" observedRunningTime="2026-04-22 18:54:29.512009236 +0000 UTC m=+481.191235554" watchObservedRunningTime="2026-04-22 18:54:29.515446307 +0000 UTC m=+481.194672596" Apr 22 18:54:30.493141 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:30.493087 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:30.493593 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:30.493238 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:54:30.494052 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:30.494003 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:54:30.496880 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:54:30.496864 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:54:31.496341 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:31.496298 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:54:31.496721 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:31.496534 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:54:32.498932 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:32.498893 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:54:32.499342 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:32.499261 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:54:42.498929 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:42.498882 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:54:42.499359 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:42.499289 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:54:52.499030 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:52.498964 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:54:52.499524 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:54:52.499501 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:55:02.499311 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:02.499258 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:55:02.499861 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:02.499794 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:55:12.499892 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:55:12.499789 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:55:12.500367 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:12.500273 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:55:22.499795 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:22.499741 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused" Apr 22 18:55:22.500256 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:22.500232 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:55:32.499762 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:32.499727 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:55:32.500345 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:32.500250 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:55:45.462849 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.462812 2562 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"] Apr 22 18:55:45.463331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.463189 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" containerID="cri-o://b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f" gracePeriod=30 Apr 22 18:55:45.463331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.463222 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" containerID="cri-o://d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88" gracePeriod=30 Apr 22 18:55:45.463331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.463225 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" containerID="cri-o://b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6" gracePeriod=30 Apr 22 18:55:45.494146 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.494120 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused" Apr 22 18:55:45.571273 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.571250 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"]
Apr 22 18:55:45.574768 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.574749 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.577262 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.577240 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-1fc88-predictor-serving-cert\""
Apr 22 18:55:45.577364 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.577262 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\""
Apr 22 18:55:45.585843 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.585760 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"]
Apr 22 18:55:45.649694 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.649664 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"]
Apr 22 18:55:45.653299 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.653282 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.655578 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.655554 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-1fc88-predictor-serving-cert\""
Apr 22 18:55:45.655692 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.655576 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\""
Apr 22 18:55:45.662709 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.662689 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"]
Apr 22 18:55:45.667411 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.667392 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.667525 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.667508 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.667589 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.667543 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.667589 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.667570 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2275x\" (UniqueName: \"kubernetes.io/projected/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kube-api-access-2275x\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.717048 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.716966 2562 generic.go:358] "Generic (PLEG): container finished" podID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerID="b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6" exitCode=2
Apr 22 18:55:45.717171 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.717043 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerDied","Data":"b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6"}
Apr 22 18:55:45.767926 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.767888 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.767926 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.767921 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct75m\" (UniqueName: \"kubernetes.io/projected/dc202605-7684-450e-8ffe-953017d94011-kube-api-access-ct75m\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.768144 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.767938 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc202605-7684-450e-8ffe-953017d94011-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.768144 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768025 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.768144 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768113 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc202605-7684-450e-8ffe-953017d94011-isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.768325 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768149 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.768325 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768177 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.768325 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2275x\" (UniqueName: \"kubernetes.io/projected/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kube-api-access-2275x\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.768479 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:55:45.768313 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-serving-cert: secret "isvc-sklearn-graph-raw-1fc88-predictor-serving-cert" not found
Apr 22 18:55:45.768479 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:55:45.768384 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls podName:c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32 nodeName:}" failed. No retries permitted until 2026-04-22 18:55:46.268361413 +0000 UTC m=+557.947587697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls") pod "isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" (UID: "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32") : secret "isvc-sklearn-graph-raw-1fc88-predictor-serving-cert" not found
Apr 22 18:55:45.768479 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768318 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.768725 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.768706 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.776687 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.776665 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2275x\" (UniqueName: \"kubernetes.io/projected/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kube-api-access-2275x\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:45.869345 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.869318 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc202605-7684-450e-8ffe-953017d94011-isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.869473 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.869382 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.869473 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.869410 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct75m\" (UniqueName: \"kubernetes.io/projected/dc202605-7684-450e-8ffe-953017d94011-kube-api-access-ct75m\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.869473 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.869434 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc202605-7684-450e-8ffe-953017d94011-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.869611 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:55:45.869543 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-serving-cert: secret "isvc-xgboost-graph-raw-1fc88-predictor-serving-cert" not found
Apr 22 18:55:45.869651 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:55:45.869625 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls podName:dc202605-7684-450e-8ffe-953017d94011 nodeName:}" failed. No retries permitted until 2026-04-22 18:55:46.369605554 +0000 UTC m=+558.048831828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls") pod "isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" (UID: "dc202605-7684-450e-8ffe-953017d94011") : secret "isvc-xgboost-graph-raw-1fc88-predictor-serving-cert" not found
Apr 22 18:55:45.869876 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.869859 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc202605-7684-450e-8ffe-953017d94011-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.870060 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.870042 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc202605-7684-450e-8ffe-953017d94011-isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:45.881666 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:45.881646 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct75m\" (UniqueName: \"kubernetes.io/projected/dc202605-7684-450e-8ffe-953017d94011-kube-api-access-ct75m\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:46.273703 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.273669 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:46.275946 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.275923 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls\") pod \"isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:46.374333 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.374306 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:46.376547 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.376526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls\") pod \"isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:46.489218 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.489183 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:46.564333 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.564303 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:55:46.611330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.611206 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"]
Apr 22 18:55:46.616476 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:55:46.616381 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63a00a5_9e6d_47fc_a0f3_5ed88b74dd32.slice/crio-a3c984d069490b42af5cd649fda89632586c49c8279d8df5f5e9e4a607bace43 WatchSource:0}: Error finding container a3c984d069490b42af5cd649fda89632586c49c8279d8df5f5e9e4a607bace43: Status 404 returned error can't find the container with id a3c984d069490b42af5cd649fda89632586c49c8279d8df5f5e9e4a607bace43
Apr 22 18:55:46.686950 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.686872 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"]
Apr 22 18:55:46.691531 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:55:46.690223 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc202605_7684_450e_8ffe_953017d94011.slice/crio-e0f379ff125620f0149b79f267f393105b9721c3f72c6289df8668f6f9e77ae1 WatchSource:0}: Error finding container e0f379ff125620f0149b79f267f393105b9721c3f72c6289df8668f6f9e77ae1: Status 404 returned error can't find the container with id e0f379ff125620f0149b79f267f393105b9721c3f72c6289df8668f6f9e77ae1
Apr 22 18:55:46.721749 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.721728 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerStarted","Data":"e0f379ff125620f0149b79f267f393105b9721c3f72c6289df8668f6f9e77ae1"}
Apr 22 18:55:46.723054 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.723002 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerStarted","Data":"27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2"}
Apr 22 18:55:46.723137 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:46.723064 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerStarted","Data":"a3c984d069490b42af5cd649fda89632586c49c8279d8df5f5e9e4a607bace43"}
Apr 22 18:55:47.727311 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:47.727272 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerStarted","Data":"48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b"}
Apr 22 18:55:49.735379 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:49.735349 2562 generic.go:358] "Generic (PLEG): container finished" podID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerID="b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f" exitCode=0
Apr 22 18:55:49.735774 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:49.735418 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerDied","Data":"b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f"}
Apr 22 18:55:50.494314 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:50.494273 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 22 18:55:50.739973 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:50.739894 2562 generic.go:358] "Generic (PLEG): container finished" podID="dc202605-7684-450e-8ffe-953017d94011" containerID="48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b" exitCode=0
Apr 22 18:55:50.740388 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:50.739971 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerDied","Data":"48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b"}
Apr 22 18:55:50.741388 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:50.741368 2562 generic.go:358] "Generic (PLEG): container finished" podID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerID="27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2" exitCode=0
Apr 22 18:55:50.741466 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:50.741414 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerDied","Data":"27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2"}
Apr 22 18:55:51.748826 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:51.748787 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerStarted","Data":"5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340"}
Apr 22 18:55:51.749306 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:51.748836 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerStarted","Data":"7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348"}
Apr 22 18:55:51.749306 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:51.749168 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:51.768869 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:51.768801 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podStartSLOduration=6.7687811270000005 podStartE2EDuration="6.768781127s" podCreationTimestamp="2026-04-22 18:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:55:51.766524969 +0000 UTC m=+563.445751286" watchObservedRunningTime="2026-04-22 18:55:51.768781127 +0000 UTC m=+563.448007419"
Apr 22 18:55:52.499519 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:52.499475 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:55:52.500788 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:52.500757 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 18:55:52.754008 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:52.753911 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:52.755569 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:52.755537 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 18:55:53.757369 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:53.757317 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 18:55:55.494335 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:55.494292 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 22 18:55:55.494754 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:55.494439 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"
Apr 22 18:55:58.762616 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:58.762574 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:55:58.763230 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:55:58.763203 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 18:56:00.493942 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:00.493901 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 22 18:56:02.499694 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:02.499650 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:56:02.500368 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:02.500337 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 18:56:05.494251 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:05.494207 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 22 18:56:08.763723 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:08.763691 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 18:56:09.814347 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:09.814313 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerStarted","Data":"278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3"}
Apr 22 18:56:09.814347 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:09.814351 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerStarted","Data":"c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f"}
Apr 22 18:56:09.814742 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:09.814548 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:56:09.833427 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:09.833380 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podStartSLOduration=6.460529772 podStartE2EDuration="24.833368204s" podCreationTimestamp="2026-04-22 18:55:45 +0000 UTC" firstStartedPulling="2026-04-22 18:55:50.741401248 +0000 UTC m=+562.420627519" lastFinishedPulling="2026-04-22 18:56:09.114239669 +0000 UTC m=+580.793465951" observedRunningTime="2026-04-22 18:56:09.83079269 +0000 UTC m=+581.510018983" watchObservedRunningTime="2026-04-22 18:56:09.833368204 +0000 UTC m=+581.512594560"
Apr 22 18:56:10.493426 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:10.493384 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 22 18:56:10.817493 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:10.817408 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:56:10.818691 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:10.818660 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 18:56:11.820516 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:11.820477 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 18:56:12.499402 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:12.499364 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:56:12.499570 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:12.499515 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"
Apr 22 18:56:12.500609 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:12.500589 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.22:8080: connect: connection refused"
Apr 22 18:56:12.500689 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:12.500679 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"
Apr 22 18:56:15.494362 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.494327 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.22:8643/healthz\": dial tcp 10.134.0.22:8643: connect: connection refused"
Apr 22 18:56:15.604809 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.604788 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"
Apr 22 18:56:15.627145 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.627123 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls\") pod \"9708e338-c6ac-4415-8a67-1c77d5f09062\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") "
Apr 22 18:56:15.627244 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.627185 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9708e338-c6ac-4415-8a67-1c77d5f09062-isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\") pod \"9708e338-c6ac-4415-8a67-1c77d5f09062\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") "
Apr 22 18:56:15.627244 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.627208 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9708e338-c6ac-4415-8a67-1c77d5f09062-kserve-provision-location\") pod \"9708e338-c6ac-4415-8a67-1c77d5f09062\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") "
Apr 22 18:56:15.627330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.627261 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tlhz\" (UniqueName: \"kubernetes.io/projected/9708e338-c6ac-4415-8a67-1c77d5f09062-kube-api-access-9tlhz\") pod \"9708e338-c6ac-4415-8a67-1c77d5f09062\" (UID: \"9708e338-c6ac-4415-8a67-1c77d5f09062\") "
Apr 22 18:56:15.627613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.627586 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9708e338-c6ac-4415-8a67-1c77d5f09062-isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config") pod "9708e338-c6ac-4415-8a67-1c77d5f09062" (UID: "9708e338-c6ac-4415-8a67-1c77d5f09062"). InnerVolumeSpecName "isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:56:15.627738 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.627623 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9708e338-c6ac-4415-8a67-1c77d5f09062-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9708e338-c6ac-4415-8a67-1c77d5f09062" (UID: "9708e338-c6ac-4415-8a67-1c77d5f09062"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:56:15.629727 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.629554 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9708e338-c6ac-4415-8a67-1c77d5f09062" (UID: "9708e338-c6ac-4415-8a67-1c77d5f09062"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:56:15.629727 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.629683 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9708e338-c6ac-4415-8a67-1c77d5f09062-kube-api-access-9tlhz" (OuterVolumeSpecName: "kube-api-access-9tlhz") pod "9708e338-c6ac-4415-8a67-1c77d5f09062" (UID: "9708e338-c6ac-4415-8a67-1c77d5f09062"). InnerVolumeSpecName "kube-api-access-9tlhz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:56:15.728540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.728457 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9tlhz\" (UniqueName: \"kubernetes.io/projected/9708e338-c6ac-4415-8a67-1c77d5f09062-kube-api-access-9tlhz\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:56:15.728540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.728498 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9708e338-c6ac-4415-8a67-1c77d5f09062-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:56:15.728540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.728512 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9708e338-c6ac-4415-8a67-1c77d5f09062-isvc-raw-sklearn-batcher-5c98d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:56:15.728540 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.728527 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9708e338-c6ac-4415-8a67-1c77d5f09062-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:56:15.833702 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.833673 2562 generic.go:358] "Generic (PLEG): container finished" podID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerID="d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88" exitCode=0
Apr 22 18:56:15.833856 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.833763 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"
event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerDied","Data":"d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88"} Apr 22 18:56:15.833856 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.833806 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" event={"ID":"9708e338-c6ac-4415-8a67-1c77d5f09062","Type":"ContainerDied","Data":"bf3b3dd77a2655da682f2865db88565a1b124745491e1c6a836ad81f8e7b4233"} Apr 22 18:56:15.833856 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.833828 2562 scope.go:117] "RemoveContainer" containerID="d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88" Apr 22 18:56:15.833856 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.833774 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh" Apr 22 18:56:15.848718 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.848698 2562 scope.go:117] "RemoveContainer" containerID="b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6" Apr 22 18:56:15.855823 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.855806 2562 scope.go:117] "RemoveContainer" containerID="b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f" Apr 22 18:56:15.860919 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.860900 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"] Apr 22 18:56:15.863872 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.863852 2562 scope.go:117] "RemoveContainer" containerID="43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee" Apr 22 18:56:15.864370 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.864352 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-5c98d-predictor-685dc8fc49-hcznh"] 
Apr 22 18:56:15.870348 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.870333 2562 scope.go:117] "RemoveContainer" containerID="d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88" Apr 22 18:56:15.870588 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:56:15.870570 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88\": container with ID starting with d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88 not found: ID does not exist" containerID="d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88" Apr 22 18:56:15.870658 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.870600 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88"} err="failed to get container status \"d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88\": rpc error: code = NotFound desc = could not find container \"d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88\": container with ID starting with d1f3e424961357cc624fe29211d8f32babb418081f1a03a5db859d8b602c5a88 not found: ID does not exist" Apr 22 18:56:15.870658 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.870623 2562 scope.go:117] "RemoveContainer" containerID="b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6" Apr 22 18:56:15.870845 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:56:15.870828 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6\": container with ID starting with b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6 not found: ID does not exist" containerID="b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6" Apr 22 18:56:15.870883 
ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.870851 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6"} err="failed to get container status \"b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6\": rpc error: code = NotFound desc = could not find container \"b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6\": container with ID starting with b43c0e06bdac7f9bbb7753da7f76f658cf4b7ba225882b1506ce57780cb193a6 not found: ID does not exist" Apr 22 18:56:15.870883 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.870867 2562 scope.go:117] "RemoveContainer" containerID="b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f" Apr 22 18:56:15.871082 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:56:15.871065 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f\": container with ID starting with b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f not found: ID does not exist" containerID="b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f" Apr 22 18:56:15.871157 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.871090 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f"} err="failed to get container status \"b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f\": rpc error: code = NotFound desc = could not find container \"b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f\": container with ID starting with b007ccc64b1b39c6e925d6ac77c33620e5cf4e25107d0e3800a1184dde24b62f not found: ID does not exist" Apr 22 18:56:15.871157 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.871110 2562 scope.go:117] 
"RemoveContainer" containerID="43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee" Apr 22 18:56:15.871336 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:56:15.871320 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee\": container with ID starting with 43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee not found: ID does not exist" containerID="43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee" Apr 22 18:56:15.871374 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:15.871340 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee"} err="failed to get container status \"43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee\": rpc error: code = NotFound desc = could not find container \"43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee\": container with ID starting with 43152d9807c338054a5845b480c0aa068a69634cd4b71c27007e39dff67299ee not found: ID does not exist" Apr 22 18:56:16.825168 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:16.825140 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" Apr 22 18:56:16.825822 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:16.825788 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 18:56:16.840367 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:16.840340 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" path="/var/lib/kubelet/pods/9708e338-c6ac-4415-8a67-1c77d5f09062/volumes" Apr 22 18:56:18.763961 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:18.763926 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 22 18:56:26.825913 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:26.825869 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 18:56:28.763698 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:28.763659 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 22 18:56:36.826623 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:36.826586 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 18:56:38.763758 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:38.763723 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.23:8080: connect: connection refused" Apr 22 18:56:46.826721 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:46.826684 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 18:56:48.763218 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:48.763178 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 22 18:56:56.825982 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:56.825942 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 22 18:56:58.764596 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:56:58.764571 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" Apr 22 18:57:06.826229 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:06.826198 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" Apr 22 18:57:25.783501 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.783468 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"] Apr 22 18:57:25.784902 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.783791 2562 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" containerID="cri-o://7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348" gracePeriod=30 Apr 22 18:57:25.784902 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.783842 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kube-rbac-proxy" containerID="cri-o://5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340" gracePeriod=30 Apr 22 18:57:25.844873 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.844849 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"] Apr 22 18:57:25.845224 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845212 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845226 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845237 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="storage-initializer" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845243 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="storage-initializer" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845256 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845262 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845271 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" Apr 22 18:57:25.845284 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845278 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" Apr 22 18:57:25.845482 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845334 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="agent" Apr 22 18:57:25.845482 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845344 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kube-rbac-proxy" Apr 22 18:57:25.845482 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.845352 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="9708e338-c6ac-4415-8a67-1c77d5f09062" containerName="kserve-container" Apr 22 18:57:25.847501 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.847483 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.849722 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.849705 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-ac146-predictor-serving-cert\"" Apr 22 18:57:25.849805 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.849711 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\"" Apr 22 18:57:25.858469 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.858448 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"] Apr 22 18:57:25.882562 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.882544 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.882666 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.882574 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2x4\" (UniqueName: \"kubernetes.io/projected/2d925cbb-5763-4c71-8df8-05f7ebe04983-kube-api-access-7c2x4\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.882666 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.882614 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d925cbb-5763-4c71-8df8-05f7ebe04983-isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.882752 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.882667 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d925cbb-5763-4c71-8df8-05f7ebe04983-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.945592 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.945565 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"] Apr 22 18:57:25.948149 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.948134 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:57:25.950571 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.950547 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-ac146-predictor-serving-cert\"" Apr 22 18:57:25.950691 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.950550 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\"" Apr 22 18:57:25.955001 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.954980 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"] Apr 22 18:57:25.955683 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.955636 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" containerID="cri-o://c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f" gracePeriod=30 Apr 22 18:57:25.955906 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.955886 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kube-rbac-proxy" containerID="cri-o://278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3" gracePeriod=30 Apr 22 18:57:25.957665 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.957643 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"] Apr 22 18:57:25.989557 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989535 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d925cbb-5763-4c71-8df8-05f7ebe04983-isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.989676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989568 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg58h\" (UniqueName: \"kubernetes.io/projected/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kube-api-access-lg58h\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:57:25.989676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989590 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:57:25.989676 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989617 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d925cbb-5763-4c71-8df8-05f7ebe04983-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 
18:57:25.989835 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989758 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8df25eb-0c57-424c-96c7-a5685dc15bf8-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:57:25.989835 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989798 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.989925 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989833 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2x4\" (UniqueName: \"kubernetes.io/projected/2d925cbb-5763-4c71-8df8-05f7ebe04983-kube-api-access-7c2x4\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.989925 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989902 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8df25eb-0c57-424c-96c7-a5685dc15bf8-isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:57:25.990040 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.989952 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d925cbb-5763-4c71-8df8-05f7ebe04983-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:25.990040 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:25.989954 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-serving-cert: secret "isvc-sklearn-graph-raw-hpa-ac146-predictor-serving-cert" not found Apr 22 18:57:25.990151 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:25.990050 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls podName:2d925cbb-5763-4c71-8df8-05f7ebe04983 nodeName:}" failed. No retries permitted until 2026-04-22 18:57:26.490032066 +0000 UTC m=+658.169258351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls") pod "isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" (UID: "2d925cbb-5763-4c71-8df8-05f7ebe04983") : secret "isvc-sklearn-graph-raw-hpa-ac146-predictor-serving-cert" not found
Apr 22 18:57:25.990263 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.990240 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d925cbb-5763-4c71-8df8-05f7ebe04983-isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:25.999842 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:25.999820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2x4\" (UniqueName: \"kubernetes.io/projected/2d925cbb-5763-4c71-8df8-05f7ebe04983-kube-api-access-7c2x4\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:26.043559 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.043484 2562 generic.go:358] "Generic (PLEG): container finished" podID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerID="5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340" exitCode=2
Apr 22 18:57:26.043690 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.043556 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerDied","Data":"5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340"}
Apr 22 18:57:26.091350 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.091321 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.091441 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.091395 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8df25eb-0c57-424c-96c7-a5685dc15bf8-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.091489 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.091473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8df25eb-0c57-424c-96c7-a5685dc15bf8-isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.091548 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.091506 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg58h\" (UniqueName: \"kubernetes.io/projected/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kube-api-access-lg58h\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.091666 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.091641 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.092076 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.092056 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8df25eb-0c57-424c-96c7-a5685dc15bf8-isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.093471 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.093453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8df25eb-0c57-424c-96c7-a5685dc15bf8-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.101893 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.101866 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg58h\" (UniqueName: \"kubernetes.io/projected/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kube-api-access-lg58h\") pod \"isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.262108 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.262083 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:26.380949 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.380920 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"]
Apr 22 18:57:26.382973 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:57:26.382947 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8df25eb_0c57_424c_96c7_a5685dc15bf8.slice/crio-54a0782cd129b68d21523777782adbdf98d7244bc5625b69c23c015b64e97f5e WatchSource:0}: Error finding container 54a0782cd129b68d21523777782adbdf98d7244bc5625b69c23c015b64e97f5e: Status 404 returned error can't find the container with id 54a0782cd129b68d21523777782adbdf98d7244bc5625b69c23c015b64e97f5e
Apr 22 18:57:26.384734 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.384719 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:57:26.494091 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.494062 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:26.496317 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.496288 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls\") pod \"isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:26.757883 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.757850 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:26.821101 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.821058 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.24:8643/healthz\": dial tcp 10.134.0.24:8643: connect: connection refused"
Apr 22 18:57:26.826490 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.826463 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused"
Apr 22 18:57:26.879529 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:26.879495 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"]
Apr 22 18:57:26.881427 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:57:26.881395 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d925cbb_5763_4c71_8df8_05f7ebe04983.slice/crio-96cfd7d518fb5809e20994c93cfbf3fa1b2cf16c6f411b2c2f34b1f45a887cc5 WatchSource:0}: Error finding container 96cfd7d518fb5809e20994c93cfbf3fa1b2cf16c6f411b2c2f34b1f45a887cc5: Status 404 returned error can't find the container with id 96cfd7d518fb5809e20994c93cfbf3fa1b2cf16c6f411b2c2f34b1f45a887cc5
Apr 22 18:57:27.048810 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:27.048722 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerStarted","Data":"4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f"}
Apr 22 18:57:27.048810 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:27.048765 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerStarted","Data":"96cfd7d518fb5809e20994c93cfbf3fa1b2cf16c6f411b2c2f34b1f45a887cc5"}
Apr 22 18:57:27.050178 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:27.050145 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerStarted","Data":"e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e"}
Apr 22 18:57:27.050178 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:27.050180 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerStarted","Data":"54a0782cd129b68d21523777782adbdf98d7244bc5625b69c23c015b64e97f5e"}
Apr 22 18:57:27.051992 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:27.051970 2562 generic.go:358] "Generic (PLEG): container finished" podID="dc202605-7684-450e-8ffe-953017d94011" containerID="278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3" exitCode=2
Apr 22 18:57:27.052128 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:27.052034 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerDied","Data":"278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3"}
Apr 22 18:57:28.757780 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:28.757741 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.23:8643/healthz\": dial tcp 10.134.0.23:8643: connect: connection refused"
Apr 22 18:57:28.763860 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:28.763832 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused"
Apr 22 18:57:29.390268 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.390247 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:57:29.525304 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.525228 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct75m\" (UniqueName: \"kubernetes.io/projected/dc202605-7684-450e-8ffe-953017d94011-kube-api-access-ct75m\") pod \"dc202605-7684-450e-8ffe-953017d94011\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") "
Apr 22 18:57:29.525304 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.525275 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls\") pod \"dc202605-7684-450e-8ffe-953017d94011\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") "
Apr 22 18:57:29.525496 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.525311 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc202605-7684-450e-8ffe-953017d94011-kserve-provision-location\") pod \"dc202605-7684-450e-8ffe-953017d94011\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") "
Apr 22 18:57:29.525496 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.525357 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc202605-7684-450e-8ffe-953017d94011-isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"dc202605-7684-450e-8ffe-953017d94011\" (UID: \"dc202605-7684-450e-8ffe-953017d94011\") "
Apr 22 18:57:29.525698 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.525660 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc202605-7684-450e-8ffe-953017d94011-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dc202605-7684-450e-8ffe-953017d94011" (UID: "dc202605-7684-450e-8ffe-953017d94011"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:29.525810 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.525733 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc202605-7684-450e-8ffe-953017d94011-isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config") pod "dc202605-7684-450e-8ffe-953017d94011" (UID: "dc202605-7684-450e-8ffe-953017d94011"). InnerVolumeSpecName "isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:57:29.527353 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.527326 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc202605-7684-450e-8ffe-953017d94011-kube-api-access-ct75m" (OuterVolumeSpecName: "kube-api-access-ct75m") pod "dc202605-7684-450e-8ffe-953017d94011" (UID: "dc202605-7684-450e-8ffe-953017d94011"). InnerVolumeSpecName "kube-api-access-ct75m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:57:29.527419 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.527400 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dc202605-7684-450e-8ffe-953017d94011" (UID: "dc202605-7684-450e-8ffe-953017d94011"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:57:29.625825 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.625795 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ct75m\" (UniqueName: \"kubernetes.io/projected/dc202605-7684-450e-8ffe-953017d94011-kube-api-access-ct75m\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:29.625825 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.625815 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dc202605-7684-450e-8ffe-953017d94011-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:29.625825 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.625825 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc202605-7684-450e-8ffe-953017d94011-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:29.626088 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.625834 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dc202605-7684-450e-8ffe-953017d94011-isvc-xgboost-graph-raw-1fc88-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:29.910779 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:29.910750 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:57:30.028831 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.028787 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls\") pod \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") "
Apr 22 18:57:30.028831 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.028829 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2275x\" (UniqueName: \"kubernetes.io/projected/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kube-api-access-2275x\") pod \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") "
Apr 22 18:57:30.029105 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.028870 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kserve-provision-location\") pod \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") "
Apr 22 18:57:30.029105 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.028892 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\") pod \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\" (UID: \"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32\") "
Apr 22 18:57:30.029262 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.029238 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" (UID: "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:30.029353 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.029328 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config") pod "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" (UID: "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32"). InnerVolumeSpecName "isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:57:30.030814 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.030791 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" (UID: "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:57:30.030926 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.030878 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kube-api-access-2275x" (OuterVolumeSpecName: "kube-api-access-2275x") pod "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" (UID: "c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32"). InnerVolumeSpecName "kube-api-access-2275x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:57:30.063126 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.063046 2562 generic.go:358] "Generic (PLEG): container finished" podID="dc202605-7684-450e-8ffe-953017d94011" containerID="c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f" exitCode=0
Apr 22 18:57:30.063126 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.063123 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerDied","Data":"c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f"}
Apr 22 18:57:30.063311 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.063147 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67" event={"ID":"dc202605-7684-450e-8ffe-953017d94011","Type":"ContainerDied","Data":"e0f379ff125620f0149b79f267f393105b9721c3f72c6289df8668f6f9e77ae1"}
Apr 22 18:57:30.063311 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.063162 2562 scope.go:117] "RemoveContainer" containerID="278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3"
Apr 22 18:57:30.063311 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.063238 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"
Apr 22 18:57:30.064911 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.064888 2562 generic.go:358] "Generic (PLEG): container finished" podID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerID="7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348" exitCode=0
Apr 22 18:57:30.065047 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.064953 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"
Apr 22 18:57:30.065047 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.064963 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerDied","Data":"7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348"}
Apr 22 18:57:30.065047 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.064990 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk" event={"ID":"c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32","Type":"ContainerDied","Data":"a3c984d069490b42af5cd649fda89632586c49c8279d8df5f5e9e4a607bace43"}
Apr 22 18:57:30.066599 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.066538 2562 generic.go:358] "Generic (PLEG): container finished" podID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerID="e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e" exitCode=0
Apr 22 18:57:30.066599 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.066561 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerDied","Data":"e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e"}
Apr 22 18:57:30.087050 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.087034 2562 scope.go:117] "RemoveContainer" containerID="c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f"
Apr 22 18:57:30.113370 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.113353 2562 scope.go:117] "RemoveContainer" containerID="48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b"
Apr 22 18:57:30.123268 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.123249 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"]
Apr 22 18:57:30.130226 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.129970 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:30.130226 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.129994 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2275x\" (UniqueName: \"kubernetes.io/projected/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kube-api-access-2275x\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:30.130226 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.130040 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-1fc88-predictor-64458d7785-k65kk"]
Apr 22 18:57:30.130226 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.130008 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:30.130226 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.130070 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32-isvc-sklearn-graph-raw-1fc88-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 18:57:30.132895 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.132625 2562 scope.go:117] "RemoveContainer" containerID="278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3"
Apr 22 18:57:30.134383 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:30.133326 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3\": container with ID starting with 278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3 not found: ID does not exist" containerID="278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3"
Apr 22 18:57:30.134383 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.133374 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3"} err="failed to get container status \"278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3\": rpc error: code = NotFound desc = could not find container \"278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3\": container with ID starting with 278c9018625b47e5741b94bce584f089c6ad24f7ea7feab040f45849d9a7a2c3 not found: ID does not exist"
Apr 22 18:57:30.134383 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.133399 2562 scope.go:117] "RemoveContainer" containerID="c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f"
Apr 22 18:57:30.137900 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:30.136144 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f\": container with ID starting with c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f not found: ID does not exist" containerID="c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f"
Apr 22 18:57:30.137900 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.136181 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f"} err="failed to get container status \"c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f\": rpc error: code = NotFound desc = could not find container \"c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f\": container with ID starting with c178f7f26e0c66de6e56cb97ee559d67bc0b95b45dfabc79a8ccefa15d39647f not found: ID does not exist"
Apr 22 18:57:30.137900 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.136203 2562 scope.go:117] "RemoveContainer" containerID="48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b"
Apr 22 18:57:30.137900 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:30.137384 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b\": container with ID starting with 48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b not found: ID does not exist" containerID="48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b"
Apr 22 18:57:30.137900 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.137414 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b"} err="failed to get container status \"48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b\": rpc error: code = NotFound desc = could not find container \"48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b\": container with ID starting with 48e7e032940ccf2b4ea37e7842c2cd74d322b4a5aaa73152f3f71ed35a03951b not found: ID does not exist"
Apr 22 18:57:30.137900 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.137437 2562 scope.go:117] "RemoveContainer" containerID="5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340"
Apr 22 18:57:30.145529 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.145510 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"]
Apr 22 18:57:30.146177 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.146164 2562 scope.go:117] "RemoveContainer" containerID="7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348"
Apr 22 18:57:30.149371 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.149352 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-1fc88-predictor-6cb54dd789-9vl67"]
Apr 22 18:57:30.154470 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.154457 2562 scope.go:117] "RemoveContainer" containerID="27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2"
Apr 22 18:57:30.161938 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.161923 2562 scope.go:117] "RemoveContainer" containerID="5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340"
Apr 22 18:57:30.162270 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:30.162250 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340\": container with ID starting with 5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340 not found: ID does not exist" containerID="5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340"
Apr 22 18:57:30.162349 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.162274 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340"} err="failed to get container status \"5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340\": rpc error: code = NotFound desc = could not find container \"5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340\": container with ID starting with 5d4d1e6f99f6ee14199e0fba3b631a84b54820b007d4104d6d05952d5fc1c340 not found: ID does not exist"
Apr 22 18:57:30.162349 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.162289 2562 scope.go:117] "RemoveContainer" containerID="7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348"
Apr 22 18:57:30.162530 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:30.162514 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348\": container with ID starting with 7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348 not found: ID does not exist" containerID="7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348"
Apr 22 18:57:30.162569 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.162535 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348"} err="failed to get container status \"7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348\": rpc error: code = NotFound desc = could not find container \"7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348\": container with ID starting with 7a0dc3b395c03ef2b0ce70a92a2f3e592c11c3405a56aedc355663186a714348 not found: ID does not exist"
Apr 22 18:57:30.162569 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.162548 2562 scope.go:117] "RemoveContainer" containerID="27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2"
Apr 22 18:57:30.162769 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:57:30.162755 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2\": container with ID starting with 27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2 not found: ID does not exist" containerID="27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2"
Apr 22 18:57:30.162812 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.162770 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2"} err="failed to get container status \"27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2\": rpc error: code = NotFound desc = could not find container \"27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2\": container with ID starting with 27bce7fa30d9daa1f9f6ab4a8521af7b3a85cb61ff8aec98fc5c7a0a6167e2d2 not found: ID does not exist"
Apr 22 18:57:30.841354 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.841323 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" path="/var/lib/kubelet/pods/c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32/volumes"
Apr 22 18:57:30.841808 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:30.841795 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc202605-7684-450e-8ffe-953017d94011" path="/var/lib/kubelet/pods/dc202605-7684-450e-8ffe-953017d94011/volumes"
Apr 22 18:57:31.071165 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:31.071135 2562 generic.go:358] "Generic (PLEG): container finished" podID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerID="4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f" exitCode=0
Apr 22 18:57:31.071546 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:31.071199 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerDied","Data":"4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f"}
Apr 22 18:57:31.073029 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:31.072998 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerStarted","Data":"1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be"}
Apr 22 18:57:31.073138 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:31.073038 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerStarted","Data":"7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08"}
Apr 22 18:57:31.073331 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:31.073313 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:31.110461 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:31.110421 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podStartSLOduration=6.110407639 podStartE2EDuration="6.110407639s" podCreationTimestamp="2026-04-22 18:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:31.108235011 +0000 UTC m=+662.787461303" watchObservedRunningTime="2026-04-22 18:57:31.110407639 +0000 UTC m=+662.789633931"
Apr 22 18:57:32.082189 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:32.082154 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerStarted","Data":"4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119"}
Apr 22 18:57:32.082189 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:32.082194 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerStarted","Data":"683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa"}
Apr 22 18:57:32.082639 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:32.082438 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"
Apr 22 18:57:32.082639 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:32.082584 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:32.083782 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:32.083750 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 22 18:57:32.101576 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:32.101531 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podStartSLOduration=7.101515792 podStartE2EDuration="7.101515792s" podCreationTimestamp="2026-04-22 18:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:32.099811509 +0000 UTC m=+663.779037801" watchObservedRunningTime="2026-04-22 18:57:32.101515792 +0000 UTC m=+663.780742080"
Apr 22 18:57:33.085596 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:33.085565 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"
Apr 22 18:57:33.086082 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:33.085918 2562
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:57:33.086625 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:33.086599 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:57:34.089453 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:34.089415 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:57:38.089555 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:38.089525 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:57:38.090127 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:38.090097 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:57:39.094282 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:39.094253 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:57:39.094814 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:57:39.094790 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:57:48.090133 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:48.090091 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:57:49.094961 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:49.094928 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:57:58.090516 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:58.090481 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:57:59.094850 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:57:59.094814 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:58:08.090787 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:08.090740 2562 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:58:09.094884 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:09.094839 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:58:18.090706 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:18.090612 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:58:19.095280 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:19.095248 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:58:28.090986 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:28.090959 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:58:29.095330 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:29.095244 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:58:39.095718 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:58:39.095687 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:59:06.041992 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.041956 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"] Apr 22 18:59:06.042586 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.042276 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" containerID="cri-o://683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa" gracePeriod=30 Apr 22 18:59:06.042586 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.042307 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kube-rbac-proxy" containerID="cri-o://4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119" gracePeriod=30 Apr 22 18:59:06.086780 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.086755 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"] Apr 22 18:59:06.087211 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087193 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087213 2562 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087226 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087234 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087245 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="storage-initializer" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087254 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="storage-initializer" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087283 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kube-rbac-proxy" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087293 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kube-rbac-proxy" Apr 22 18:59:06.087309 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087310 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kube-rbac-proxy" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087319 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kube-rbac-proxy" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087335 2562 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="storage-initializer" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087343 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="storage-initializer" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087456 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kube-rbac-proxy" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087468 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kserve-container" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087482 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c63a00a5-9e6d-47fc-a0f3-5ed88b74dd32" containerName="kube-rbac-proxy" Apr 22 18:59:06.087684 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.087497 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc202605-7684-450e-8ffe-953017d94011" containerName="kserve-container" Apr 22 18:59:06.090867 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.090848 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.093276 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.093249 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-b0d1b-predictor-serving-cert\"" Apr 22 18:59:06.093411 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.093390 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\"" Apr 22 18:59:06.100146 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.100122 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"] Apr 22 18:59:06.115628 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.115598 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.115748 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.115642 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chmjr\" (UniqueName: \"kubernetes.io/projected/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-kube-api-access-chmjr\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.115748 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.115685 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-proxy-tls\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.129858 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.129833 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"] Apr 22 18:59:06.130199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.130167 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" containerID="cri-o://7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08" gracePeriod=30 Apr 22 18:59:06.130312 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.130202 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kube-rbac-proxy" containerID="cri-o://1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be" gracePeriod=30 Apr 22 18:59:06.217204 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.217176 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.217321 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:59:06.217210 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chmjr\" (UniqueName: \"kubernetes.io/projected/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-kube-api-access-chmjr\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.217321 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.217237 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-proxy-tls\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.217763 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.217737 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.219503 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.219476 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-proxy-tls\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.224669 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.224646 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-chmjr\" (UniqueName: \"kubernetes.io/projected/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-kube-api-access-chmjr\") pod \"message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") " pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.377505 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.377276 2562 generic.go:358] "Generic (PLEG): container finished" podID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerID="4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119" exitCode=2 Apr 22 18:59:06.377505 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.377357 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerDied","Data":"4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119"} Apr 22 18:59:06.380201 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.380177 2562 generic.go:358] "Generic (PLEG): container finished" podID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerID="1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be" exitCode=2 Apr 22 18:59:06.380312 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.380219 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerDied","Data":"1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be"} Apr 22 18:59:06.401953 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.401934 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:06.518733 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:06.518701 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"] Apr 22 18:59:06.521495 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:59:06.521469 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888d1d9a_1b44_4e9f_87ad_7ff15086c9be.slice/crio-712215acc0385371ae12959e75d207bb9721195e3bf788569944b3c0d09b9a5a WatchSource:0}: Error finding container 712215acc0385371ae12959e75d207bb9721195e3bf788569944b3c0d09b9a5a: Status 404 returned error can't find the container with id 712215acc0385371ae12959e75d207bb9721195e3bf788569944b3c0d09b9a5a Apr 22 18:59:07.385253 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:07.385219 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" event={"ID":"888d1d9a-1b44-4e9f-87ad-7ff15086c9be","Type":"ContainerStarted","Data":"712215acc0385371ae12959e75d207bb9721195e3bf788569944b3c0d09b9a5a"} Apr 22 18:59:08.085870 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:08.085830 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.26:8643/healthz\": dial tcp 10.134.0.26:8643: connect: connection refused" Apr 22 18:59:08.090128 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:08.090103 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.26:8080: connect: connection refused" Apr 22 18:59:08.389590 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:08.389506 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" event={"ID":"888d1d9a-1b44-4e9f-87ad-7ff15086c9be","Type":"ContainerStarted","Data":"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"} Apr 22 18:59:08.389590 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:08.389543 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" event={"ID":"888d1d9a-1b44-4e9f-87ad-7ff15086c9be","Type":"ContainerStarted","Data":"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"} Apr 22 18:59:08.389998 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:08.389635 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:08.414517 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:08.414465 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" podStartSLOduration=1.4650275179999999 podStartE2EDuration="2.414451473s" podCreationTimestamp="2026-04-22 18:59:06 +0000 UTC" firstStartedPulling="2026-04-22 18:59:06.523314795 +0000 UTC m=+758.202541065" lastFinishedPulling="2026-04-22 18:59:07.472738745 +0000 UTC m=+759.151965020" observedRunningTime="2026-04-22 18:59:08.412758868 +0000 UTC m=+760.091985167" watchObservedRunningTime="2026-04-22 18:59:08.414451473 +0000 UTC m=+760.093677764" Apr 22 18:59:09.090242 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.090196 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.25:8643/healthz\": dial tcp 10.134.0.25:8643: connect: connection refused" Apr 22 18:59:09.094770 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.094747 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 22 18:59:09.392377 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.392276 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:09.394139 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.394119 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:09.660444 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.660423 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:59:09.749107 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.749068 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg58h\" (UniqueName: \"kubernetes.io/projected/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kube-api-access-lg58h\") pod \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " Apr 22 18:59:09.749255 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.749179 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8df25eb-0c57-424c-96c7-a5685dc15bf8-proxy-tls\") pod \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " Apr 22 18:59:09.749255 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.749211 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kserve-provision-location\") pod \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " Apr 22 18:59:09.749255 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.749230 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8df25eb-0c57-424c-96c7-a5685dc15bf8-isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\" (UID: \"c8df25eb-0c57-424c-96c7-a5685dc15bf8\") " Apr 22 18:59:09.749558 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.749530 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "c8df25eb-0c57-424c-96c7-a5685dc15bf8" (UID: "c8df25eb-0c57-424c-96c7-a5685dc15bf8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:09.749649 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.749605 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8df25eb-0c57-424c-96c7-a5685dc15bf8-isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config") pod "c8df25eb-0c57-424c-96c7-a5685dc15bf8" (UID: "c8df25eb-0c57-424c-96c7-a5685dc15bf8"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:59:09.751076 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.751051 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kube-api-access-lg58h" (OuterVolumeSpecName: "kube-api-access-lg58h") pod "c8df25eb-0c57-424c-96c7-a5685dc15bf8" (UID: "c8df25eb-0c57-424c-96c7-a5685dc15bf8"). InnerVolumeSpecName "kube-api-access-lg58h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:59:09.751164 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.751150 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8df25eb-0c57-424c-96c7-a5685dc15bf8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c8df25eb-0c57-424c-96c7-a5685dc15bf8" (UID: "c8df25eb-0c57-424c-96c7-a5685dc15bf8"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:59:09.850811 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.850787 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8df25eb-0c57-424c-96c7-a5685dc15bf8-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:09.850811 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.850811 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:09.850962 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.850822 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c8df25eb-0c57-424c-96c7-a5685dc15bf8-isvc-xgboost-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:09.850962 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:09.850834 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lg58h\" (UniqueName: \"kubernetes.io/projected/c8df25eb-0c57-424c-96c7-a5685dc15bf8-kube-api-access-lg58h\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:10.165423 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.165402 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:59:10.254767 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.254737 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls\") pod \"2d925cbb-5763-4c71-8df8-05f7ebe04983\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " Apr 22 18:59:10.254927 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.254789 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d925cbb-5763-4c71-8df8-05f7ebe04983-kserve-provision-location\") pod \"2d925cbb-5763-4c71-8df8-05f7ebe04983\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " Apr 22 18:59:10.254927 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.254843 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d925cbb-5763-4c71-8df8-05f7ebe04983-isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") pod \"2d925cbb-5763-4c71-8df8-05f7ebe04983\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " Apr 22 18:59:10.254927 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.254876 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c2x4\" (UniqueName: \"kubernetes.io/projected/2d925cbb-5763-4c71-8df8-05f7ebe04983-kube-api-access-7c2x4\") pod \"2d925cbb-5763-4c71-8df8-05f7ebe04983\" (UID: \"2d925cbb-5763-4c71-8df8-05f7ebe04983\") " Apr 22 18:59:10.255199 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.255169 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d925cbb-5763-4c71-8df8-05f7ebe04983-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "2d925cbb-5763-4c71-8df8-05f7ebe04983" (UID: "2d925cbb-5763-4c71-8df8-05f7ebe04983"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:59:10.255277 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.255251 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d925cbb-5763-4c71-8df8-05f7ebe04983-isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config") pod "2d925cbb-5763-4c71-8df8-05f7ebe04983" (UID: "2d925cbb-5763-4c71-8df8-05f7ebe04983"). InnerVolumeSpecName "isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:59:10.256915 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.256892 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d925cbb-5763-4c71-8df8-05f7ebe04983-kube-api-access-7c2x4" (OuterVolumeSpecName: "kube-api-access-7c2x4") pod "2d925cbb-5763-4c71-8df8-05f7ebe04983" (UID: "2d925cbb-5763-4c71-8df8-05f7ebe04983"). InnerVolumeSpecName "kube-api-access-7c2x4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:59:10.256991 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.256922 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2d925cbb-5763-4c71-8df8-05f7ebe04983" (UID: "2d925cbb-5763-4c71-8df8-05f7ebe04983"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:59:10.356410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.356340 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d925cbb-5763-4c71-8df8-05f7ebe04983-isvc-sklearn-graph-raw-hpa-ac146-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:10.356410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.356373 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7c2x4\" (UniqueName: \"kubernetes.io/projected/2d925cbb-5763-4c71-8df8-05f7ebe04983-kube-api-access-7c2x4\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:10.356410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.356384 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d925cbb-5763-4c71-8df8-05f7ebe04983-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:10.356410 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.356393 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d925cbb-5763-4c71-8df8-05f7ebe04983-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 18:59:10.396101 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.396076 2562 generic.go:358] "Generic (PLEG): container finished" podID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerID="683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa" exitCode=0 Apr 22 18:59:10.396438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.396146 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" 
event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerDied","Data":"683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa"} Apr 22 18:59:10.396438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.396171 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" event={"ID":"2d925cbb-5763-4c71-8df8-05f7ebe04983","Type":"ContainerDied","Data":"96cfd7d518fb5809e20994c93cfbf3fa1b2cf16c6f411b2c2f34b1f45a887cc5"} Apr 22 18:59:10.396438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.396178 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr" Apr 22 18:59:10.396438 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.396193 2562 scope.go:117] "RemoveContainer" containerID="4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119" Apr 22 18:59:10.397906 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.397886 2562 generic.go:358] "Generic (PLEG): container finished" podID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerID="7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08" exitCode=0 Apr 22 18:59:10.398049 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.397968 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" Apr 22 18:59:10.398049 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.397964 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerDied","Data":"7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08"} Apr 22 18:59:10.398049 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.398047 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8" event={"ID":"c8df25eb-0c57-424c-96c7-a5685dc15bf8","Type":"ContainerDied","Data":"54a0782cd129b68d21523777782adbdf98d7244bc5625b69c23c015b64e97f5e"} Apr 22 18:59:10.404822 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.404801 2562 scope.go:117] "RemoveContainer" containerID="683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa" Apr 22 18:59:10.414045 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.414029 2562 scope.go:117] "RemoveContainer" containerID="4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f" Apr 22 18:59:10.421534 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.421514 2562 scope.go:117] "RemoveContainer" containerID="4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119" Apr 22 18:59:10.421783 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.421767 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119\": container with ID starting with 4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119 not found: ID does not exist" containerID="4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119" Apr 22 18:59:10.421857 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:59:10.421793 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119"} err="failed to get container status \"4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119\": rpc error: code = NotFound desc = could not find container \"4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119\": container with ID starting with 4272b845f444b79051a9b6e9802194aa91a98d5bbb2db910ef5c01a657063119 not found: ID does not exist" Apr 22 18:59:10.421857 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.421816 2562 scope.go:117] "RemoveContainer" containerID="683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa" Apr 22 18:59:10.422129 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.422104 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa\": container with ID starting with 683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa not found: ID does not exist" containerID="683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa" Apr 22 18:59:10.422230 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.422136 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa"} err="failed to get container status \"683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa\": rpc error: code = NotFound desc = could not find container \"683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa\": container with ID starting with 683464fd6d3db832a17dc76af415be23d86f0233dc447c9417409311e17b9aaa not found: ID does not exist" Apr 22 18:59:10.422230 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.422157 2562 scope.go:117] "RemoveContainer" 
containerID="4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f" Apr 22 18:59:10.422367 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.422347 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"] Apr 22 18:59:10.422473 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.422398 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f\": container with ID starting with 4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f not found: ID does not exist" containerID="4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f" Apr 22 18:59:10.422473 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.422424 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f"} err="failed to get container status \"4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f\": rpc error: code = NotFound desc = could not find container \"4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f\": container with ID starting with 4c85dcdba44bb44bd93d801dba85e275e4771b5c0b083f3865b5a324dedd179f not found: ID does not exist" Apr 22 18:59:10.422473 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.422447 2562 scope.go:117] "RemoveContainer" containerID="1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be" Apr 22 18:59:10.426781 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.426761 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-ac146-predictor-5fbdc999d9-rkch8"] Apr 22 18:59:10.427628 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.427601 2562 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8df25eb_0c57_424c_96c7_a5685dc15bf8.slice/crio-54a0782cd129b68d21523777782adbdf98d7244bc5625b69c23c015b64e97f5e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d925cbb_5763_4c71_8df8_05f7ebe04983.slice/crio-96cfd7d518fb5809e20994c93cfbf3fa1b2cf16c6f411b2c2f34b1f45a887cc5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d925cbb_5763_4c71_8df8_05f7ebe04983.slice\": RecentStats: unable to find data in memory cache]" Apr 22 18:59:10.429571 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.429555 2562 scope.go:117] "RemoveContainer" containerID="7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08" Apr 22 18:59:10.436297 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.436284 2562 scope.go:117] "RemoveContainer" containerID="e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e" Apr 22 18:59:10.438852 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.438831 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"] Apr 22 18:59:10.443342 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.443326 2562 scope.go:117] "RemoveContainer" containerID="1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be" Apr 22 18:59:10.443604 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.443586 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be\": container with ID starting with 1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be not found: ID does not exist" containerID="1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be" Apr 22 18:59:10.443670 ip-10-0-128-208 
kubenswrapper[2562]: I0422 18:59:10.443609 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be"} err="failed to get container status \"1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be\": rpc error: code = NotFound desc = could not find container \"1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be\": container with ID starting with 1eb88a0fd05ac399ceb02fb88f982fe84df7cf6936158ad52fc6b95febe2a2be not found: ID does not exist" Apr 22 18:59:10.443670 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.443633 2562 scope.go:117] "RemoveContainer" containerID="7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08" Apr 22 18:59:10.443895 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.443877 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08\": container with ID starting with 7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08 not found: ID does not exist" containerID="7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08" Apr 22 18:59:10.443954 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.443901 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08"} err="failed to get container status \"7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08\": rpc error: code = NotFound desc = could not find container \"7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08\": container with ID starting with 7534222e0a67e7465f966acb29e35eea9064d54bca6b868c72215f0f66d44e08 not found: ID does not exist" Apr 22 18:59:10.443954 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.443916 2562 scope.go:117] "RemoveContainer" 
containerID="e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e" Apr 22 18:59:10.444394 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:10.444377 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e\": container with ID starting with e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e not found: ID does not exist" containerID="e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e" Apr 22 18:59:10.444457 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.444401 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e"} err="failed to get container status \"e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e\": rpc error: code = NotFound desc = could not find container \"e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e\": container with ID starting with e93b3303817e43563ff5de5185461e86c93b4075ed94d0ff106aa926ae3ccf2e not found: ID does not exist" Apr 22 18:59:10.444606 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.444588 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-ac146-predictor-5b98784765-7f2jr"] Apr 22 18:59:10.840827 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.840795 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" path="/var/lib/kubelet/pods/2d925cbb-5763-4c71-8df8-05f7ebe04983/volumes" Apr 22 18:59:10.841329 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:10.841314 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" path="/var/lib/kubelet/pods/c8df25eb-0c57-424c-96c7-a5685dc15bf8/volumes" Apr 22 18:59:16.406956 ip-10-0-128-208 kubenswrapper[2562]: 
I0422 18:59:16.406926 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" Apr 22 18:59:26.136911 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.136871 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"] Apr 22 18:59:26.137445 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137416 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" Apr 22 18:59:26.137445 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137435 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" Apr 22 18:59:26.137445 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137445 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kube-rbac-proxy" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137455 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kube-rbac-proxy" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137468 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="storage-initializer" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137477 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="storage-initializer" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137501 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="storage-initializer" Apr 22 
18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137510 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="storage-initializer" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137519 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137527 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137542 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kube-rbac-proxy" Apr 22 18:59:26.137601 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137550 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kube-rbac-proxy" Apr 22 18:59:26.138066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137631 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kube-rbac-proxy" Apr 22 18:59:26.138066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137647 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8df25eb-0c57-424c-96c7-a5685dc15bf8" containerName="kserve-container" Apr 22 18:59:26.138066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137658 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kserve-container" Apr 22 18:59:26.138066 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.137670 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d925cbb-5763-4c71-8df8-05f7ebe04983" containerName="kube-rbac-proxy" 
Apr 22 18:59:26.141464 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.141440 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.144158 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.144129 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-b0d1b-predictor-serving-cert\"" Apr 22 18:59:26.144298 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.144180 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\"" Apr 22 18:59:26.151425 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.151400 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"] Apr 22 18:59:26.180505 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.180461 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.180685 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.180513 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/084e4830-5629-40e2-8868-401e1eec0590-isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.180685 ip-10-0-128-208 kubenswrapper[2562]: I0422 
18:59:26.180605 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slwrh\" (UniqueName: \"kubernetes.io/projected/084e4830-5629-40e2-8868-401e1eec0590-kube-api-access-slwrh\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.180685 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.180640 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/084e4830-5629-40e2-8868-401e1eec0590-kserve-provision-location\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.281964 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.281934 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.282154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.281973 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/084e4830-5629-40e2-8868-401e1eec0590-isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.282154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.281999 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slwrh\" (UniqueName: \"kubernetes.io/projected/084e4830-5629-40e2-8868-401e1eec0590-kube-api-access-slwrh\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.282154 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.282053 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/084e4830-5629-40e2-8868-401e1eec0590-kserve-provision-location\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 18:59:26.282154 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:26.282101 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-serving-cert: secret "isvc-logger-raw-b0d1b-predictor-serving-cert" not found Apr 22 18:59:26.282338 ip-10-0-128-208 kubenswrapper[2562]: E0422 18:59:26.282174 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls podName:084e4830-5629-40e2-8868-401e1eec0590 nodeName:}" failed. No retries permitted until 2026-04-22 18:59:26.782154586 +0000 UTC m=+778.461380858 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls") pod "isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" (UID: "084e4830-5629-40e2-8868-401e1eec0590") : secret "isvc-logger-raw-b0d1b-predictor-serving-cert" not found
Apr 22 18:59:26.282470 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.282451 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/084e4830-5629-40e2-8868-401e1eec0590-kserve-provision-location\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:26.282660 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.282643 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/084e4830-5629-40e2-8868-401e1eec0590-isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:26.292281 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.292259 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slwrh\" (UniqueName: \"kubernetes.io/projected/084e4830-5629-40e2-8868-401e1eec0590-kube-api-access-slwrh\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:26.785501 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.785466 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:26.787824 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:26.787798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls\") pod \"isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:27.054767 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:27.054696 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:27.173464 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:27.173442 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"]
Apr 22 18:59:27.175442 ip-10-0-128-208 kubenswrapper[2562]: W0422 18:59:27.175416 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod084e4830_5629_40e2_8868_401e1eec0590.slice/crio-5a5abb5a22d321d2d03f5a7979c6d56e18d8228feedb35da7e26ae750cf0b838 WatchSource:0}: Error finding container 5a5abb5a22d321d2d03f5a7979c6d56e18d8228feedb35da7e26ae750cf0b838: Status 404 returned error can't find the container with id 5a5abb5a22d321d2d03f5a7979c6d56e18d8228feedb35da7e26ae750cf0b838
Apr 22 18:59:27.454613 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:27.454575 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerStarted","Data":"e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435"}
Apr 22 18:59:27.454777 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:27.454618 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerStarted","Data":"5a5abb5a22d321d2d03f5a7979c6d56e18d8228feedb35da7e26ae750cf0b838"}
Apr 22 18:59:31.467977 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:31.467942 2562 generic.go:358] "Generic (PLEG): container finished" podID="084e4830-5629-40e2-8868-401e1eec0590" containerID="e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435" exitCode=0
Apr 22 18:59:31.468351 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:31.468036 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerDied","Data":"e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435"}
Apr 22 18:59:32.472966 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.472930 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerStarted","Data":"fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c"}
Apr 22 18:59:32.472966 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.472970 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerStarted","Data":"1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7"}
Apr 22 18:59:32.473365 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.472980 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerStarted","Data":"3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69"}
Apr 22 18:59:32.473432 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.473363 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:32.473507 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.473489 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:32.474893 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.474867 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 18:59:32.493668 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:32.493628 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podStartSLOduration=6.493617257 podStartE2EDuration="6.493617257s" podCreationTimestamp="2026-04-22 18:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:59:32.492010775 +0000 UTC m=+784.171237065" watchObservedRunningTime="2026-04-22 18:59:32.493617257 +0000 UTC m=+784.172843540"
Apr 22 18:59:33.476067 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:33.476006 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:33.476472 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:33.476150 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 18:59:33.477097 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:33.477061 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:34.479109 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:34.479071 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 18:59:34.479566 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:34.479502 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:39.483793 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:39.483765 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 18:59:39.484316 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:39.484284 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 18:59:39.484733 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:39.484713 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:49.484815 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:49.484776 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 18:59:49.485303 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:49.485281 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:59.484708 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:59.484663 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 18:59:59.485219 ip-10-0-128-208 kubenswrapper[2562]: I0422 18:59:59.485142 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:09.484275 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:09.484233 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 19:00:09.484725 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:09.484673 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:19.485158 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:19.485121 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 19:00:19.485619 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:19.485595 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:29.484759 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:29.484712 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 22 19:00:29.485203 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:29.485182 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:39.485204 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:39.485176 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 19:00:39.485683 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:39.485271 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"
Apr 22 19:00:51.141485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.141453 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z_888d1d9a-1b44-4e9f-87ad-7ff15086c9be/kserve-container/0.log"
Apr 22 19:00:51.311937 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.311905 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"]
Apr 22 19:00:51.312315 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.312263 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" containerID="cri-o://3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69" gracePeriod=30
Apr 22 19:00:51.312475 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.312293 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" containerID="cri-o://fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c" gracePeriod=30
Apr 22 19:00:51.312555 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.312333 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" containerID="cri-o://1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7" gracePeriod=30
Apr 22 19:00:51.353293 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.353267 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"]
Apr 22 19:00:51.358629 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.358613 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.361145 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.361124 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\""
Apr 22 19:00:51.361145 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.361125 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-scale-raw-99910-predictor-serving-cert\""
Apr 22 19:00:51.367566 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.367544 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"]
Apr 22 19:00:51.386644 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.386614 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1992b2ee-b78e-4965-afe9-1912692b664d-isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.386774 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.386742 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89s6b\" (UniqueName: \"kubernetes.io/projected/1992b2ee-b78e-4965-afe9-1912692b664d-kube-api-access-89s6b\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.386834 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.386779 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1992b2ee-b78e-4965-afe9-1912692b664d-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.386898 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.386879 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.408546 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.408481 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"]
Apr 22 19:00:51.408761 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.408740 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kserve-container" containerID="cri-o://215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf" gracePeriod=30
Apr 22 19:00:51.408847 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.408761 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kube-rbac-proxy" containerID="cri-o://543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf" gracePeriod=30
Apr 22 19:00:51.488228 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.488200 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1992b2ee-b78e-4965-afe9-1912692b664d-isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.488331 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.488260 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89s6b\" (UniqueName: \"kubernetes.io/projected/1992b2ee-b78e-4965-afe9-1912692b664d-kube-api-access-89s6b\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.488331 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.488280 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1992b2ee-b78e-4965-afe9-1912692b664d-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.488331 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.488321 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.488480 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:00:51.488415 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-serving-cert: secret "isvc-sklearn-scale-raw-99910-predictor-serving-cert" not found
Apr 22 19:00:51.488480 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:00:51.488472 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls podName:1992b2ee-b78e-4965-afe9-1912692b664d nodeName:}" failed. No retries permitted until 2026-04-22 19:00:51.988453629 +0000 UTC m=+863.667679898 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls") pod "isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" (UID: "1992b2ee-b78e-4965-afe9-1912692b664d") : secret "isvc-sklearn-scale-raw-99910-predictor-serving-cert" not found
Apr 22 19:00:51.488683 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.488661 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1992b2ee-b78e-4965-afe9-1912692b664d-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.488869 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.488851 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1992b2ee-b78e-4965-afe9-1912692b664d-isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.496549 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.496526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89s6b\" (UniqueName: \"kubernetes.io/projected/1992b2ee-b78e-4965-afe9-1912692b664d-kube-api-access-89s6b\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.645162 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.645137 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"
Apr 22 19:00:51.690620 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.690554 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-proxy-tls\") pod \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") "
Apr 22 19:00:51.690747 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.690620 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") "
Apr 22 19:00:51.690747 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.690651 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chmjr\" (UniqueName: \"kubernetes.io/projected/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-kube-api-access-chmjr\") pod \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\" (UID: \"888d1d9a-1b44-4e9f-87ad-7ff15086c9be\") "
Apr 22 19:00:51.691051 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.691000 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config") pod "888d1d9a-1b44-4e9f-87ad-7ff15086c9be" (UID: "888d1d9a-1b44-4e9f-87ad-7ff15086c9be"). InnerVolumeSpecName "message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:00:51.692685 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.692657 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "888d1d9a-1b44-4e9f-87ad-7ff15086c9be" (UID: "888d1d9a-1b44-4e9f-87ad-7ff15086c9be"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:00:51.692791 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.692764 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-kube-api-access-chmjr" (OuterVolumeSpecName: "kube-api-access-chmjr") pod "888d1d9a-1b44-4e9f-87ad-7ff15086c9be" (UID: "888d1d9a-1b44-4e9f-87ad-7ff15086c9be"). InnerVolumeSpecName "kube-api-access-chmjr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:00:51.714132 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.714107 2562 generic.go:358] "Generic (PLEG): container finished" podID="084e4830-5629-40e2-8868-401e1eec0590" containerID="1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7" exitCode=2
Apr 22 19:00:51.714229 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.714182 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerDied","Data":"1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7"}
Apr 22 19:00:51.715459 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715437 2562 generic.go:358] "Generic (PLEG): container finished" podID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerID="543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf" exitCode=2
Apr 22 19:00:51.715459 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715457 2562 generic.go:358] "Generic (PLEG): container finished" podID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerID="215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf" exitCode=2
Apr 22 19:00:51.715577 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715502 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"
Apr 22 19:00:51.715577 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715504 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" event={"ID":"888d1d9a-1b44-4e9f-87ad-7ff15086c9be","Type":"ContainerDied","Data":"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"}
Apr 22 19:00:51.715577 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715533 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" event={"ID":"888d1d9a-1b44-4e9f-87ad-7ff15086c9be","Type":"ContainerDied","Data":"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"}
Apr 22 19:00:51.715577 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715545 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z" event={"ID":"888d1d9a-1b44-4e9f-87ad-7ff15086c9be","Type":"ContainerDied","Data":"712215acc0385371ae12959e75d207bb9721195e3bf788569944b3c0d09b9a5a"}
Apr 22 19:00:51.715577 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.715561 2562 scope.go:117] "RemoveContainer" containerID="543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"
Apr 22 19:00:51.724267 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.724248 2562 scope.go:117] "RemoveContainer" containerID="215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"
Apr 22 19:00:51.731926 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.731910 2562 scope.go:117] "RemoveContainer" containerID="543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"
Apr 22 19:00:51.732198 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:00:51.732179 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf\": container with ID starting with 543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf not found: ID does not exist" containerID="543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"
Apr 22 19:00:51.732269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732204 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"} err="failed to get container status \"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf\": rpc error: code = NotFound desc = could not find container \"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf\": container with ID starting with 543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf not found: ID does not exist"
Apr 22 19:00:51.732269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732220 2562 scope.go:117] "RemoveContainer" containerID="215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"
Apr 22 19:00:51.732499 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:00:51.732481 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf\": container with ID starting with 215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf not found: ID does not exist" containerID="215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"
Apr 22 19:00:51.732564 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732507 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"} err="failed to get container status \"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf\": rpc error: code = NotFound desc = could not find container \"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf\": container with ID starting with 215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf not found: ID does not exist"
Apr 22 19:00:51.732564 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732528 2562 scope.go:117] "RemoveContainer" containerID="543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"
Apr 22 19:00:51.732749 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732733 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf"} err="failed to get container status \"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf\": rpc error: code = NotFound desc = could not find container \"543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf\": container with ID starting with 543d639584bd4585e052da66df514c04ac4621d81603645c994f35bfe3481bbf not found: ID does not exist"
Apr 22 19:00:51.732798 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732750 2562 scope.go:117] "RemoveContainer" containerID="215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"
Apr 22 19:00:51.732973 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.732957 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf"} err="failed to get container status \"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf\": rpc error: code = NotFound desc = could not find container \"215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf\": container with ID starting with 215abf82f90cac4313922fdca0f9a78ba5c3366ffb91358f6630a37e1a5364bf not found: ID does not exist"
Apr 22 19:00:51.736904 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.736849 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"]
Apr 22 19:00:51.738989 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.738971 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-b0d1b-predictor-68cf68f497-5rb5z"]
Apr 22 19:00:51.792196 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.792177 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:00:51.792280 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.792198 2562 reconciler_common.go:299] "Volume detached for volume \"message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-message-dumper-raw-b0d1b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:00:51.792280 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.792208 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-chmjr\" (UniqueName: \"kubernetes.io/projected/888d1d9a-1b44-4e9f-87ad-7ff15086c9be-kube-api-access-chmjr\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:00:51.993844 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.993821 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:51.996322 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:51.996302 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls\") pod \"isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:52.269270 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:52.269176 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"
Apr 22 19:00:52.391436 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:52.391330 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"]
Apr 22 19:00:52.393840 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:00:52.393815 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1992b2ee_b78e_4965_afe9_1912692b664d.slice/crio-d6c2149eb4b983361c3e52048f1d3a610b4577292b9d02edbed6e77f07cf1890 WatchSource:0}: Error finding container d6c2149eb4b983361c3e52048f1d3a610b4577292b9d02edbed6e77f07cf1890: Status 404 returned error can't find the container with id d6c2149eb4b983361c3e52048f1d3a610b4577292b9d02edbed6e77f07cf1890
Apr 22 19:00:52.720334 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:52.720295 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerStarted","Data":"fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39"}
Apr 22 19:00:52.720334 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:52.720336
2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerStarted","Data":"d6c2149eb4b983361c3e52048f1d3a610b4577292b9d02edbed6e77f07cf1890"} Apr 22 19:00:52.840966 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:52.840934 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" path="/var/lib/kubelet/pods/888d1d9a-1b44-4e9f-87ad-7ff15086c9be/volumes" Apr 22 19:00:54.479839 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:54.479802 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 22 19:00:55.731466 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:55.731436 2562 generic.go:358] "Generic (PLEG): container finished" podID="084e4830-5629-40e2-8868-401e1eec0590" containerID="3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69" exitCode=0 Apr 22 19:00:55.731849 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:55.731505 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerDied","Data":"3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69"} Apr 22 19:00:56.735203 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:56.735167 2562 generic.go:358] "Generic (PLEG): container finished" podID="1992b2ee-b78e-4965-afe9-1912692b664d" containerID="fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39" exitCode=0 Apr 22 19:00:56.735563 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:56.735240 2562 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerDied","Data":"fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39"} Apr 22 19:00:57.740039 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:57.739990 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerStarted","Data":"07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9"} Apr 22 19:00:57.740406 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:57.740053 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerStarted","Data":"b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652"} Apr 22 19:00:57.740406 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:57.740368 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" Apr 22 19:00:57.740496 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:57.740478 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" Apr 22 19:00:57.741841 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:57.741815 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:00:57.759408 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:57.759346 2562 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podStartSLOduration=6.7593314509999995 podStartE2EDuration="6.759331451s" podCreationTimestamp="2026-04-22 19:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:00:57.758493284 +0000 UTC m=+869.437719588" watchObservedRunningTime="2026-04-22 19:00:57.759331451 +0000 UTC m=+869.438557744" Apr 22 19:00:58.742940 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:58.742903 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:00:59.479272 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:59.479232 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 22 19:00:59.484618 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:59.484586 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:00:59.484901 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:00:59.484878 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 22 19:01:03.747341 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:03.747313 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" Apr 22 19:01:03.747960 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:03.747930 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:01:04.479542 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:04.479502 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 22 19:01:04.479725 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:04.479627 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 19:01:09.479841 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:09.479797 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 22 19:01:09.484244 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:09.484211 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:01:09.484601 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:09.484580 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:01:13.748921 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:13.748840 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:01:14.480062 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:14.480000 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 22 19:01:19.479330 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:19.479287 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.28:8643/healthz\": dial tcp 10.134.0.28:8643: connect: connection refused" Apr 22 19:01:19.484647 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:19.484626 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 22 19:01:19.484753 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:19.484739 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 19:01:19.485083 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:19.485058 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:01:19.485196 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:19.485151 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 19:01:21.449129 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.449107 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 19:01:21.538732 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.538699 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/084e4830-5629-40e2-8868-401e1eec0590-kserve-provision-location\") pod \"084e4830-5629-40e2-8868-401e1eec0590\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " Apr 22 19:01:21.538882 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.538791 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/084e4830-5629-40e2-8868-401e1eec0590-isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\") pod \"084e4830-5629-40e2-8868-401e1eec0590\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " Apr 22 19:01:21.538882 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.538828 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls\") pod \"084e4830-5629-40e2-8868-401e1eec0590\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " Apr 22 19:01:21.538882 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.538873 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slwrh\" (UniqueName: \"kubernetes.io/projected/084e4830-5629-40e2-8868-401e1eec0590-kube-api-access-slwrh\") pod \"084e4830-5629-40e2-8868-401e1eec0590\" (UID: \"084e4830-5629-40e2-8868-401e1eec0590\") " Apr 22 19:01:21.539052 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.538982 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084e4830-5629-40e2-8868-401e1eec0590-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"084e4830-5629-40e2-8868-401e1eec0590" (UID: "084e4830-5629-40e2-8868-401e1eec0590"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:01:21.539170 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.539153 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/084e4830-5629-40e2-8868-401e1eec0590-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:01:21.539210 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.539175 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/084e4830-5629-40e2-8868-401e1eec0590-isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config") pod "084e4830-5629-40e2-8868-401e1eec0590" (UID: "084e4830-5629-40e2-8868-401e1eec0590"). InnerVolumeSpecName "isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:01:21.540867 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.540842 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "084e4830-5629-40e2-8868-401e1eec0590" (UID: "084e4830-5629-40e2-8868-401e1eec0590"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:01:21.540959 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.540891 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084e4830-5629-40e2-8868-401e1eec0590-kube-api-access-slwrh" (OuterVolumeSpecName: "kube-api-access-slwrh") pod "084e4830-5629-40e2-8868-401e1eec0590" (UID: "084e4830-5629-40e2-8868-401e1eec0590"). InnerVolumeSpecName "kube-api-access-slwrh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:01:21.640433 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.640364 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/084e4830-5629-40e2-8868-401e1eec0590-isvc-logger-raw-b0d1b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:01:21.640433 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.640391 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/084e4830-5629-40e2-8868-401e1eec0590-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:01:21.640433 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.640402 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-slwrh\" (UniqueName: \"kubernetes.io/projected/084e4830-5629-40e2-8868-401e1eec0590-kube-api-access-slwrh\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:01:21.810509 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.810472 2562 generic.go:358] "Generic (PLEG): container finished" podID="084e4830-5629-40e2-8868-401e1eec0590" containerID="fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c" exitCode=0 Apr 22 19:01:21.810679 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.810554 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerDied","Data":"fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c"} Apr 22 19:01:21.810679 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.810594 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" Apr 22 19:01:21.810679 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.810599 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd" event={"ID":"084e4830-5629-40e2-8868-401e1eec0590","Type":"ContainerDied","Data":"5a5abb5a22d321d2d03f5a7979c6d56e18d8228feedb35da7e26ae750cf0b838"} Apr 22 19:01:21.810679 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.810621 2562 scope.go:117] "RemoveContainer" containerID="fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c" Apr 22 19:01:21.818636 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.818622 2562 scope.go:117] "RemoveContainer" containerID="1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7" Apr 22 19:01:21.825626 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.825610 2562 scope.go:117] "RemoveContainer" containerID="3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69" Apr 22 19:01:21.832203 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.832183 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"] Apr 22 19:01:21.833106 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.833083 2562 scope.go:117] "RemoveContainer" containerID="e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435" Apr 22 19:01:21.836000 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.835979 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-b0d1b-predictor-6d6cfbb95c-4vphd"] Apr 22 19:01:21.839864 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.839848 2562 scope.go:117] "RemoveContainer" containerID="fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c" Apr 22 19:01:21.840127 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:01:21.840108 2562 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c\": container with ID starting with fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c not found: ID does not exist" containerID="fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c" Apr 22 19:01:21.840179 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840137 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c"} err="failed to get container status \"fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c\": rpc error: code = NotFound desc = could not find container \"fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c\": container with ID starting with fec62c044ee599c44a9af6fa21811f160ce5b3ca0f0b460708bd0ee0034e4e4c not found: ID does not exist" Apr 22 19:01:21.840179 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840154 2562 scope.go:117] "RemoveContainer" containerID="1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7" Apr 22 19:01:21.840386 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:01:21.840369 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7\": container with ID starting with 1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7 not found: ID does not exist" containerID="1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7" Apr 22 19:01:21.840449 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840396 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7"} err="failed to get container status 
\"1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7\": rpc error: code = NotFound desc = could not find container \"1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7\": container with ID starting with 1a7cfe2e508f141e9de4f2ffa2e23d2ed8b236d02897669fc0935e5a2f81a7d7 not found: ID does not exist" Apr 22 19:01:21.840449 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840417 2562 scope.go:117] "RemoveContainer" containerID="3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69" Apr 22 19:01:21.840636 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:01:21.840620 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69\": container with ID starting with 3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69 not found: ID does not exist" containerID="3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69" Apr 22 19:01:21.840677 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840641 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69"} err="failed to get container status \"3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69\": rpc error: code = NotFound desc = could not find container \"3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69\": container with ID starting with 3b02737d2ddf218e73236952ec13e42e7d39139ebbe58abe6f67be291f994f69 not found: ID does not exist" Apr 22 19:01:21.840677 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840665 2562 scope.go:117] "RemoveContainer" containerID="e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435" Apr 22 19:01:21.840858 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:01:21.840843 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435\": container with ID starting with e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435 not found: ID does not exist" containerID="e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435" Apr 22 19:01:21.840902 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:21.840865 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435"} err="failed to get container status \"e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435\": rpc error: code = NotFound desc = could not find container \"e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435\": container with ID starting with e4585850e7b946fc29b414b0a9eb758ab2195dc887a49c1467cdf0d1fe966435 not found: ID does not exist" Apr 22 19:01:22.840520 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:22.840493 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084e4830-5629-40e2-8868-401e1eec0590" path="/var/lib/kubelet/pods/084e4830-5629-40e2-8868-401e1eec0590/volumes" Apr 22 19:01:23.748850 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:23.748814 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:01:33.748887 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:33.748848 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:01:43.748685 ip-10-0-128-208 
kubenswrapper[2562]: I0422 19:01:43.748647 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:01:53.747967 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:01:53.747931 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:03.748865 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:03.748827 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:04.837652 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:04.837615 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:14.837926 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:14.837885 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:24.838058 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:24.837996 2562 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:34.837673 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:34.837623 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:44.838415 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:44.838323 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:02:54.837669 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:02:54.837626 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:03:04.837684 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:04.837635 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:03:14.840504 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:14.840479 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" Apr 22 19:03:21.500002 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.499964 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"] Apr 22 19:03:21.500433 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.500303 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" containerID="cri-o://b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652" gracePeriod=30 Apr 22 19:03:21.500433 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.500331 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kube-rbac-proxy" containerID="cri-o://07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9" gracePeriod=30 Apr 22 19:03:21.600178 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600148 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"] Apr 22 19:03:21.600488 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600470 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" Apr 22 19:03:21.600488 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600484 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600522 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="084e4830-5629-40e2-8868-401e1eec0590" 
containerName="kserve-container" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600531 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600544 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kube-rbac-proxy" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600554 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kube-rbac-proxy" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600569 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="storage-initializer" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600576 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="storage-initializer" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600591 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kserve-container" Apr 22 19:03:21.600588 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600596 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kserve-container" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600610 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600615 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600672 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kube-rbac-proxy" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600680 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="888d1d9a-1b44-4e9f-87ad-7ff15086c9be" containerName="kserve-container" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600686 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kube-rbac-proxy" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600691 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="agent" Apr 22 19:03:21.600848 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.600697 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="084e4830-5629-40e2-8868-401e1eec0590" containerName="kserve-container" Apr 22 19:03:21.603790 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.603775 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.606055 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.606023 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-fb7532-predictor-serving-cert\"" Apr 22 19:03:21.606166 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.606154 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-fb7532-kube-rbac-proxy-sar-config\"" Apr 22 19:03:21.612955 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.612932 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"] Apr 22 19:03:21.715025 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.714984 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kserve-provision-location\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.715135 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.715043 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g787l\" (UniqueName: \"kubernetes.io/projected/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kube-api-access-g787l\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.715135 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.715096 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.715135 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.715121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fb5f39c-b96d-48df-a77e-bed36a074a5c-isvc-primary-fb7532-kube-rbac-proxy-sar-config\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.815678 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.815616 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kserve-provision-location\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.815678 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.815649 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g787l\" (UniqueName: \"kubernetes.io/projected/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kube-api-access-g787l\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.815828 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.815680 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls\") pod 
\"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.815828 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.815723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fb5f39c-b96d-48df-a77e-bed36a074a5c-isvc-primary-fb7532-kube-rbac-proxy-sar-config\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.815926 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:03:21.815846 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-fb7532-predictor-serving-cert: secret "isvc-primary-fb7532-predictor-serving-cert" not found Apr 22 19:03:21.815926 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:03:21.815918 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls podName:0fb5f39c-b96d-48df-a77e-bed36a074a5c nodeName:}" failed. No retries permitted until 2026-04-22 19:03:22.315896216 +0000 UTC m=+1013.995122500 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls") pod "isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" (UID: "0fb5f39c-b96d-48df-a77e-bed36a074a5c") : secret "isvc-primary-fb7532-predictor-serving-cert" not found Apr 22 19:03:21.816006 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.815987 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kserve-provision-location\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.816444 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.816423 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fb5f39c-b96d-48df-a77e-bed36a074a5c-isvc-primary-fb7532-kube-rbac-proxy-sar-config\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:21.826885 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:21.826854 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g787l\" (UniqueName: \"kubernetes.io/projected/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kube-api-access-g787l\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:22.169421 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.169338 2562 generic.go:358] "Generic (PLEG): container finished" podID="1992b2ee-b78e-4965-afe9-1912692b664d" containerID="07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9" 
exitCode=2 Apr 22 19:03:22.169563 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.169415 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerDied","Data":"07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9"} Apr 22 19:03:22.319262 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.319235 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:22.321535 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.321517 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls\") pod \"isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:22.515743 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.515707 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:22.637188 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.637163 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"] Apr 22 19:03:22.639660 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:03:22.639621 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb5f39c_b96d_48df_a77e_bed36a074a5c.slice/crio-457f7afbdb390ebf9ace7c266b1d75ac5b4567f1a6a52472e5cdd37aeb5a303f WatchSource:0}: Error finding container 457f7afbdb390ebf9ace7c266b1d75ac5b4567f1a6a52472e5cdd37aeb5a303f: Status 404 returned error can't find the container with id 457f7afbdb390ebf9ace7c266b1d75ac5b4567f1a6a52472e5cdd37aeb5a303f Apr 22 19:03:22.641446 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:22.641427 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:03:23.174264 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:23.174224 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerStarted","Data":"68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c"} Apr 22 19:03:23.174264 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:23.174268 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerStarted","Data":"457f7afbdb390ebf9ace7c266b1d75ac5b4567f1a6a52472e5cdd37aeb5a303f"} Apr 22 19:03:23.743534 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:23.743484 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" 
podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 22 19:03:24.838141 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:24.838104 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused" Apr 22 19:03:27.187148 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:27.187113 2562 generic.go:358] "Generic (PLEG): container finished" podID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerID="68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c" exitCode=0 Apr 22 19:03:27.187512 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:27.187176 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerDied","Data":"68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c"} Apr 22 19:03:28.192223 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:28.192191 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerStarted","Data":"896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047"} Apr 22 19:03:28.192642 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:28.192232 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerStarted","Data":"ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b"} Apr 22 19:03:28.192642 ip-10-0-128-208 
kubenswrapper[2562]: I0422 19:03:28.192509 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:28.192642 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:28.192630 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:03:28.193707 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:28.193679 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 22 19:03:28.210844 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:28.210807 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podStartSLOduration=7.210796309 podStartE2EDuration="7.210796309s" podCreationTimestamp="2026-04-22 19:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:03:28.209105389 +0000 UTC m=+1019.888331678" watchObservedRunningTime="2026-04-22 19:03:28.210796309 +0000 UTC m=+1019.890022605" Apr 22 19:03:28.743306 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:28.743268 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.29:8643/healthz\": dial tcp 10.134.0.29:8643: connect: connection refused" Apr 22 19:03:29.195169 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:29.195142 2562 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused" Apr 22 19:03:30.133642 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.133618 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" Apr 22 19:03:30.199590 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.199559 2562 generic.go:358] "Generic (PLEG): container finished" podID="1992b2ee-b78e-4965-afe9-1912692b664d" containerID="b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652" exitCode=0 Apr 22 19:03:30.199959 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.199643 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" Apr 22 19:03:30.199959 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.199651 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerDied","Data":"b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652"} Apr 22 19:03:30.199959 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.199699 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6" event={"ID":"1992b2ee-b78e-4965-afe9-1912692b664d","Type":"ContainerDied","Data":"d6c2149eb4b983361c3e52048f1d3a610b4577292b9d02edbed6e77f07cf1890"} Apr 22 19:03:30.199959 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.199722 2562 scope.go:117] "RemoveContainer" containerID="07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9" Apr 22 19:03:30.207095 ip-10-0-128-208 kubenswrapper[2562]: I0422 
19:03:30.207071 2562 scope.go:117] "RemoveContainer" containerID="b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652" Apr 22 19:03:30.213682 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.213667 2562 scope.go:117] "RemoveContainer" containerID="fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39" Apr 22 19:03:30.220200 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.220184 2562 scope.go:117] "RemoveContainer" containerID="07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9" Apr 22 19:03:30.220431 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:03:30.220412 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9\": container with ID starting with 07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9 not found: ID does not exist" containerID="07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9" Apr 22 19:03:30.220476 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.220440 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9"} err="failed to get container status \"07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9\": rpc error: code = NotFound desc = could not find container \"07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9\": container with ID starting with 07bd336c357d9c0cacc398a172dcb2292101786110f2a3a1f8fc621f8e611ac9 not found: ID does not exist" Apr 22 19:03:30.220476 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.220458 2562 scope.go:117] "RemoveContainer" containerID="b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652" Apr 22 19:03:30.220680 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:03:30.220664 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652\": container with ID starting with b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652 not found: ID does not exist" containerID="b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652" Apr 22 19:03:30.220720 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.220685 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652"} err="failed to get container status \"b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652\": rpc error: code = NotFound desc = could not find container \"b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652\": container with ID starting with b099bb58aa4df404824da9bd31000b0952edd2b4851308aab2ac5433d367e652 not found: ID does not exist" Apr 22 19:03:30.220720 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.220699 2562 scope.go:117] "RemoveContainer" containerID="fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39" Apr 22 19:03:30.220918 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:03:30.220903 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39\": container with ID starting with fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39 not found: ID does not exist" containerID="fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39" Apr 22 19:03:30.220970 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.220921 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39"} err="failed to get container status \"fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39\": rpc error: code = NotFound desc = could 
not find container \"fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39\": container with ID starting with fbca50afd8924f26b855e00846f56bfa17e21c826d7882002a06db36bdf85a39 not found: ID does not exist" Apr 22 19:03:30.289086 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.289054 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89s6b\" (UniqueName: \"kubernetes.io/projected/1992b2ee-b78e-4965-afe9-1912692b664d-kube-api-access-89s6b\") pod \"1992b2ee-b78e-4965-afe9-1912692b664d\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " Apr 22 19:03:30.289223 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.289144 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls\") pod \"1992b2ee-b78e-4965-afe9-1912692b664d\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " Apr 22 19:03:30.289223 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.289178 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1992b2ee-b78e-4965-afe9-1912692b664d-kserve-provision-location\") pod \"1992b2ee-b78e-4965-afe9-1912692b664d\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " Apr 22 19:03:30.289323 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.289241 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1992b2ee-b78e-4965-afe9-1912692b664d-isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\") pod \"1992b2ee-b78e-4965-afe9-1912692b664d\" (UID: \"1992b2ee-b78e-4965-afe9-1912692b664d\") " Apr 22 19:03:30.289535 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.289512 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1992b2ee-b78e-4965-afe9-1912692b664d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1992b2ee-b78e-4965-afe9-1912692b664d" (UID: "1992b2ee-b78e-4965-afe9-1912692b664d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:03:30.289631 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.289607 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1992b2ee-b78e-4965-afe9-1912692b664d-isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config") pod "1992b2ee-b78e-4965-afe9-1912692b664d" (UID: "1992b2ee-b78e-4965-afe9-1912692b664d"). InnerVolumeSpecName "isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:03:30.291198 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.291179 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1992b2ee-b78e-4965-afe9-1912692b664d-kube-api-access-89s6b" (OuterVolumeSpecName: "kube-api-access-89s6b") pod "1992b2ee-b78e-4965-afe9-1912692b664d" (UID: "1992b2ee-b78e-4965-afe9-1912692b664d"). InnerVolumeSpecName "kube-api-access-89s6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:03:30.291261 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.291229 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1992b2ee-b78e-4965-afe9-1912692b664d" (UID: "1992b2ee-b78e-4965-afe9-1912692b664d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:03:30.390538 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.390516 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1992b2ee-b78e-4965-afe9-1912692b664d-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:03:30.390538 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.390539 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1992b2ee-b78e-4965-afe9-1912692b664d-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:03:30.390666 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.390550 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1992b2ee-b78e-4965-afe9-1912692b664d-isvc-sklearn-scale-raw-99910-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:03:30.390666 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.390561 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89s6b\" (UniqueName: \"kubernetes.io/projected/1992b2ee-b78e-4965-afe9-1912692b664d-kube-api-access-89s6b\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:03:30.523159 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.523129 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"] Apr 22 19:03:30.527860 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.527841 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-99910-predictor-58f57f8787-7mbw6"] Apr 22 19:03:30.847730 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:30.847635 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1992b2ee-b78e-4965-afe9-1912692b664d" path="/var/lib/kubelet/pods/1992b2ee-b78e-4965-afe9-1912692b664d/volumes"
Apr 22 19:03:34.199407 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:34.199382 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"
Apr 22 19:03:34.199911 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:34.199884 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 19:03:44.200805 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:44.200762 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 19:03:54.200561 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:03:54.200519 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 19:04:04.200544 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:04.200503 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 19:04:14.200853 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:14.200770 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 19:04:24.200560 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:24.200520 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 22 19:04:34.200855 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:34.200823 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"
Apr 22 19:04:41.719779 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.719747 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"]
Apr 22 19:04:41.720168 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720153 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container"
Apr 22 19:04:41.720168 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720169 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container"
Apr 22 19:04:41.720246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720187 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kube-rbac-proxy"
Apr 22 19:04:41.720246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720193 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kube-rbac-proxy"
Apr 22 19:04:41.720246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720205 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="storage-initializer"
Apr 22 19:04:41.720246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720214 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="storage-initializer"
Apr 22 19:04:41.720363 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720273 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kserve-container"
Apr 22 19:04:41.720363 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.720287 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="1992b2ee-b78e-4965-afe9-1912692b664d" containerName="kube-rbac-proxy"
Apr 22 19:04:41.724590 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.724575 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.727102 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.727073 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-fb7532-predictor-serving-cert\""
Apr 22 19:04:41.727246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.727106 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-fb7532-kube-rbac-proxy-sar-config\""
Apr 22 19:04:41.727246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.727076 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-fb7532\""
Apr 22 19:04:41.727246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.727073 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 22 19:04:41.727467 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.727454 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-fb7532-dockercfg-7k28f\""
Apr 22 19:04:41.732833 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.732812 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"]
Apr 22 19:04:41.766772 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.766746 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-cabundle-cert\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.766873 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.766782 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-isvc-secondary-fb7532-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.766873 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.766852 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.766957 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.766879 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmzfk\" (UniqueName: \"kubernetes.io/projected/c939f87a-cc37-4796-b91c-115915090815-kube-api-access-tmzfk\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.766957 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.766917 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c939f87a-cc37-4796-b91c-115915090815-kserve-provision-location\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.867620 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.867589 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.867773 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.867627 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmzfk\" (UniqueName: \"kubernetes.io/projected/c939f87a-cc37-4796-b91c-115915090815-kube-api-access-tmzfk\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.867773 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:04:41.867743 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-serving-cert: secret "isvc-secondary-fb7532-predictor-serving-cert" not found
Apr 22 19:04:41.867894 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.867766 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c939f87a-cc37-4796-b91c-115915090815-kserve-provision-location\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.867894 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:04:41.867809 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls podName:c939f87a-cc37-4796-b91c-115915090815 nodeName:}" failed. No retries permitted until 2026-04-22 19:04:42.367792801 +0000 UTC m=+1094.047019071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls") pod "isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" (UID: "c939f87a-cc37-4796-b91c-115915090815") : secret "isvc-secondary-fb7532-predictor-serving-cert" not found
Apr 22 19:04:41.867894 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.867844 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-cabundle-cert\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.867894 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.867889 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-isvc-secondary-fb7532-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.868303 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.868211 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c939f87a-cc37-4796-b91c-115915090815-kserve-provision-location\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.868452 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.868435 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-cabundle-cert\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.868488 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.868465 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-isvc-secondary-fb7532-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:41.875977 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:41.875957 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmzfk\" (UniqueName: \"kubernetes.io/projected/c939f87a-cc37-4796-b91c-115915090815-kube-api-access-tmzfk\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:42.372074 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:42.372037 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:42.374485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:42.374463 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls\") pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") " pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:42.635806 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:42.635709 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:04:42.756529 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:42.756504 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"]
Apr 22 19:04:42.759266 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:04:42.759239 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc939f87a_cc37_4796_b91c_115915090815.slice/crio-28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756 WatchSource:0}: Error finding container 28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756: Status 404 returned error can't find the container with id 28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756
Apr 22 19:04:43.431358 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:43.431322 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" event={"ID":"c939f87a-cc37-4796-b91c-115915090815","Type":"ContainerStarted","Data":"9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27"}
Apr 22 19:04:43.431358 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:43.431356 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" event={"ID":"c939f87a-cc37-4796-b91c-115915090815","Type":"ContainerStarted","Data":"28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756"}
Apr 22 19:04:49.451552 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:49.451524 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/0.log"
Apr 22 19:04:49.451901 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:49.451563 2562 generic.go:358] "Generic (PLEG): container finished" podID="c939f87a-cc37-4796-b91c-115915090815" containerID="9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27" exitCode=1
Apr 22 19:04:49.451901 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:49.451596 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" event={"ID":"c939f87a-cc37-4796-b91c-115915090815","Type":"ContainerDied","Data":"9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27"}
Apr 22 19:04:50.456181 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:50.456153 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/0.log"
Apr 22 19:04:50.456599 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:50.456232 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" event={"ID":"c939f87a-cc37-4796-b91c-115915090815","Type":"ContainerStarted","Data":"830d8068878b5b796abc8d475189ea7cbfe46ca5c2b34e8245cdf0822a80de4a"}
Apr 22 19:04:56.475269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:56.475241 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/1.log"
Apr 22 19:04:56.475691 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:56.475624 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/0.log"
Apr 22 19:04:56.475691 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:56.475665 2562 generic.go:358] "Generic (PLEG): container finished" podID="c939f87a-cc37-4796-b91c-115915090815" containerID="830d8068878b5b796abc8d475189ea7cbfe46ca5c2b34e8245cdf0822a80de4a" exitCode=1
Apr 22 19:04:56.475810 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:56.475716 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" event={"ID":"c939f87a-cc37-4796-b91c-115915090815","Type":"ContainerDied","Data":"830d8068878b5b796abc8d475189ea7cbfe46ca5c2b34e8245cdf0822a80de4a"}
Apr 22 19:04:56.475810 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:56.475759 2562 scope.go:117] "RemoveContainer" containerID="9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27"
Apr 22 19:04:56.476222 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:56.476201 2562 scope.go:117] "RemoveContainer" containerID="9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27"
Apr 22 19:04:56.485995 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:04:56.485967 2562 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_kserve-ci-e2e-test_c939f87a-cc37-4796-b91c-115915090815_0 in pod sandbox 28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756 from index: no such id: '9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27'" containerID="9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27"
Apr 22 19:04:56.486085 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:04:56.486039 2562 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_kserve-ci-e2e-test_c939f87a-cc37-4796-b91c-115915090815_0 in pod sandbox 28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756 from index: no such id: '9c3b601f62dfec47ca7a7bf7b32a25d74f93f9771e387c17885b40652cac9d27'; Skipping pod \"isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_kserve-ci-e2e-test(c939f87a-cc37-4796-b91c-115915090815)\"" logger="UnhandledError"
Apr 22 19:04:56.487337 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:04:56.487318 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_kserve-ci-e2e-test(c939f87a-cc37-4796-b91c-115915090815)\"" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" podUID="c939f87a-cc37-4796-b91c-115915090815"
Apr 22 19:04:57.480057 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:57.480007 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/1.log"
Apr 22 19:04:59.810905 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.810870 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"]
Apr 22 19:04:59.858824 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.858791 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"]
Apr 22 19:04:59.859282 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.859231 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" containerID="cri-o://ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b" gracePeriod=30
Apr 22 19:04:59.859526 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.859287 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kube-rbac-proxy" containerID="cri-o://896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047" gracePeriod=30
Apr 22 19:04:59.922499 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.922469 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"]
Apr 22 19:04:59.927497 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.927476 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:04:59.930139 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.930115 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-afbf81-predictor-serving-cert\""
Apr 22 19:04:59.930139 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.930115 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\""
Apr 22 19:04:59.930294 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.930251 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-afbf81\""
Apr 22 19:04:59.930294 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.930257 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-afbf81-dockercfg-czzqg\""
Apr 22 19:04:59.934316 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.934293 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"]
Apr 22 19:04:59.981354 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.981336 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/1.log"
Apr 22 19:04:59.981432 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:04:59.981402 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"
Apr 22 19:05:00.016795 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.016766 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36effd3-460e-4a67-bce0-d94829ce0106-kserve-provision-location\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.016942 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.016815 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.016942 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.016849 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-cabundle-cert\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.016942 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.016889 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.017136 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.016984 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h2pk\" (UniqueName: \"kubernetes.io/projected/c36effd3-460e-4a67-bce0-d94829ce0106-kube-api-access-8h2pk\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.118362 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118278 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls\") pod \"c939f87a-cc37-4796-b91c-115915090815\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") "
Apr 22 19:05:00.118362 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118333 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-cabundle-cert\") pod \"c939f87a-cc37-4796-b91c-115915090815\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") "
Apr 22 19:05:00.118581 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118360 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-isvc-secondary-fb7532-kube-rbac-proxy-sar-config\") pod \"c939f87a-cc37-4796-b91c-115915090815\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") "
Apr 22 19:05:00.118581 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118386 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmzfk\" (UniqueName: \"kubernetes.io/projected/c939f87a-cc37-4796-b91c-115915090815-kube-api-access-tmzfk\") pod \"c939f87a-cc37-4796-b91c-115915090815\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") "
Apr 22 19:05:00.118581 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118470 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c939f87a-cc37-4796-b91c-115915090815-kserve-provision-location\") pod \"c939f87a-cc37-4796-b91c-115915090815\" (UID: \"c939f87a-cc37-4796-b91c-115915090815\") "
Apr 22 19:05:00.118729 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118636 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36effd3-460e-4a67-bce0-d94829ce0106-kserve-provision-location\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.118729 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118685 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.118840 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118731 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-cabundle-cert\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.118840 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118757 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c939f87a-cc37-4796-b91c-115915090815-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c939f87a-cc37-4796-b91c-115915090815" (UID: "c939f87a-cc37-4796-b91c-115915090815"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:05:00.118840 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118766 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.118840 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118767 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c939f87a-cc37-4796-b91c-115915090815" (UID: "c939f87a-cc37-4796-b91c-115915090815"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:05:00.118840 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118787 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-isvc-secondary-fb7532-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-fb7532-kube-rbac-proxy-sar-config") pod "c939f87a-cc37-4796-b91c-115915090815" (UID: "c939f87a-cc37-4796-b91c-115915090815"). InnerVolumeSpecName "isvc-secondary-fb7532-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118845 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h2pk\" (UniqueName: \"kubernetes.io/projected/c36effd3-460e-4a67-bce0-d94829ce0106-kube-api-access-8h2pk\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:00.118859 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-serving-cert: secret "isvc-init-fail-afbf81-predictor-serving-cert" not found
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:00.118919 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls podName:c36effd3-460e-4a67-bce0-d94829ce0106 nodeName:}" failed. No retries permitted until 2026-04-22 19:05:00.618898968 +0000 UTC m=+1112.298125238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls") pod "isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" (UID: "c36effd3-460e-4a67-bce0-d94829ce0106") : secret "isvc-init-fail-afbf81-predictor-serving-cert" not found
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118975 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c939f87a-cc37-4796-b91c-115915090815-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.118992 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-cabundle-cert\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.119010 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36effd3-460e-4a67-bce0-d94829ce0106-kserve-provision-location\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.119123 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.119008 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c939f87a-cc37-4796-b91c-115915090815-isvc-secondary-fb7532-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:05:00.119520 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.119499 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.119561 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.119526 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-cabundle-cert\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.120623 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.120601 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c939f87a-cc37-4796-b91c-115915090815" (UID: "c939f87a-cc37-4796-b91c-115915090815"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:05:00.120814 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.120795 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c939f87a-cc37-4796-b91c-115915090815-kube-api-access-tmzfk" (OuterVolumeSpecName: "kube-api-access-tmzfk") pod "c939f87a-cc37-4796-b91c-115915090815" (UID: "c939f87a-cc37-4796-b91c-115915090815"). InnerVolumeSpecName "kube-api-access-tmzfk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:05:00.129157 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.129136 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h2pk\" (UniqueName: \"kubernetes.io/projected/c36effd3-460e-4a67-bce0-d94829ce0106-kube-api-access-8h2pk\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"
Apr 22 19:05:00.219995 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.219963 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c939f87a-cc37-4796-b91c-115915090815-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:05:00.219995 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.219991 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmzfk\" (UniqueName: \"kubernetes.io/projected/c939f87a-cc37-4796-b91c-115915090815-kube-api-access-tmzfk\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:05:00.490363 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.490328 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-fb7532-predictor-68c4c86b9-77xgj_c939f87a-cc37-4796-b91c-115915090815/storage-initializer/1.log"
Apr 22 19:05:00.490564 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.490430 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" event={"ID":"c939f87a-cc37-4796-b91c-115915090815","Type":"ContainerDied","Data":"28bba24a4c48374cf24876f93edc12d13d8f5b644d1781aa0dcce0dac61ac756"}
Apr 22 19:05:00.490564 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.490469 2562 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj" Apr 22 19:05:00.490564 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.490472 2562 scope.go:117] "RemoveContainer" containerID="830d8068878b5b796abc8d475189ea7cbfe46ca5c2b34e8245cdf0822a80de4a" Apr 22 19:05:00.492868 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.492841 2562 generic.go:358] "Generic (PLEG): container finished" podID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerID="896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047" exitCode=2 Apr 22 19:05:00.492991 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.492909 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerDied","Data":"896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047"} Apr 22 19:05:00.526971 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.526949 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"] Apr 22 19:05:00.530745 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.530722 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-fb7532-predictor-68c4c86b9-77xgj"] Apr 22 19:05:00.622388 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.622365 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" Apr 22 19:05:00.624636 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.624612 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls\") pod \"isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" Apr 22 19:05:00.841415 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.841338 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c939f87a-cc37-4796-b91c-115915090815" path="/var/lib/kubelet/pods/c939f87a-cc37-4796-b91c-115915090815/volumes" Apr 22 19:05:00.842699 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.842679 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" Apr 22 19:05:00.963743 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:00.963673 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"] Apr 22 19:05:00.966307 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:05:00.966275 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc36effd3_460e_4a67_bce0_d94829ce0106.slice/crio-b98a09ed692eee33cdfb1befda7bdeb1b35ffa4c7cc73b0292af78da9f203634 WatchSource:0}: Error finding container b98a09ed692eee33cdfb1befda7bdeb1b35ffa4c7cc73b0292af78da9f203634: Status 404 returned error can't find the container with id b98a09ed692eee33cdfb1befda7bdeb1b35ffa4c7cc73b0292af78da9f203634 Apr 22 19:05:01.497849 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:01.497816 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" event={"ID":"c36effd3-460e-4a67-bce0-d94829ce0106","Type":"ContainerStarted","Data":"ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291"} Apr 22 19:05:01.498054 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:01.497855 2562 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" event={"ID":"c36effd3-460e-4a67-bce0-d94829ce0106","Type":"ContainerStarted","Data":"b98a09ed692eee33cdfb1befda7bdeb1b35ffa4c7cc73b0292af78da9f203634"} Apr 22 19:05:04.099262 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.099240 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:05:04.254060 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254031 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kserve-provision-location\") pod \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " Apr 22 19:05:04.254252 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254076 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g787l\" (UniqueName: \"kubernetes.io/projected/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kube-api-access-g787l\") pod \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " Apr 22 19:05:04.254325 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254249 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls\") pod \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " Apr 22 19:05:04.254386 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254320 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fb5f39c-b96d-48df-a77e-bed36a074a5c-isvc-primary-fb7532-kube-rbac-proxy-sar-config\") pod 
\"0fb5f39c-b96d-48df-a77e-bed36a074a5c\" (UID: \"0fb5f39c-b96d-48df-a77e-bed36a074a5c\") " Apr 22 19:05:04.254435 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254379 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fb5f39c-b96d-48df-a77e-bed36a074a5c" (UID: "0fb5f39c-b96d-48df-a77e-bed36a074a5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:04.254678 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254654 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:04.254678 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.254660 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fb5f39c-b96d-48df-a77e-bed36a074a5c-isvc-primary-fb7532-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-fb7532-kube-rbac-proxy-sar-config") pod "0fb5f39c-b96d-48df-a77e-bed36a074a5c" (UID: "0fb5f39c-b96d-48df-a77e-bed36a074a5c"). InnerVolumeSpecName "isvc-primary-fb7532-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:05:04.256220 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.256196 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kube-api-access-g787l" (OuterVolumeSpecName: "kube-api-access-g787l") pod "0fb5f39c-b96d-48df-a77e-bed36a074a5c" (UID: "0fb5f39c-b96d-48df-a77e-bed36a074a5c"). InnerVolumeSpecName "kube-api-access-g787l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:05:04.256305 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.256219 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0fb5f39c-b96d-48df-a77e-bed36a074a5c" (UID: "0fb5f39c-b96d-48df-a77e-bed36a074a5c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:05:04.355767 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.355725 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-fb7532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fb5f39c-b96d-48df-a77e-bed36a074a5c-isvc-primary-fb7532-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:04.355767 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.355762 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g787l\" (UniqueName: \"kubernetes.io/projected/0fb5f39c-b96d-48df-a77e-bed36a074a5c-kube-api-access-g787l\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:04.355930 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.355777 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fb5f39c-b96d-48df-a77e-bed36a074a5c-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:04.511378 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.511299 2562 generic.go:358] "Generic (PLEG): container finished" podID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerID="ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b" exitCode=0 Apr 22 19:05:04.511513 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.511384 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" Apr 22 19:05:04.511513 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.511386 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerDied","Data":"ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b"} Apr 22 19:05:04.511513 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.511424 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp" event={"ID":"0fb5f39c-b96d-48df-a77e-bed36a074a5c","Type":"ContainerDied","Data":"457f7afbdb390ebf9ace7c266b1d75ac5b4567f1a6a52472e5cdd37aeb5a303f"} Apr 22 19:05:04.511513 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.511441 2562 scope.go:117] "RemoveContainer" containerID="896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047" Apr 22 19:05:04.519748 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.519730 2562 scope.go:117] "RemoveContainer" containerID="ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b" Apr 22 19:05:04.526611 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.526591 2562 scope.go:117] "RemoveContainer" containerID="68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c" Apr 22 19:05:04.531977 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.531955 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"] Apr 22 19:05:04.533843 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.533824 2562 scope.go:117] "RemoveContainer" containerID="896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047" Apr 22 19:05:04.534231 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:04.534098 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047\": container with ID starting with 896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047 not found: ID does not exist" containerID="896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047" Apr 22 19:05:04.534231 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.534131 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047"} err="failed to get container status \"896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047\": rpc error: code = NotFound desc = could not find container \"896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047\": container with ID starting with 896bd6a03d876ec6c9a46ad9c2d06eb6b40973a160049afb92c4d58bc51d0047 not found: ID does not exist" Apr 22 19:05:04.534231 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.534154 2562 scope.go:117] "RemoveContainer" containerID="ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b" Apr 22 19:05:04.534761 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:04.534738 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b\": container with ID starting with ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b not found: ID does not exist" containerID="ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b" Apr 22 19:05:04.534896 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.534768 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b"} err="failed to get container status \"ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b\": rpc error: code = NotFound desc = could not find 
container \"ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b\": container with ID starting with ec2d37fda211a206fb52c004f28ff637666b9c414bfc5112af8832dee784db1b not found: ID does not exist" Apr 22 19:05:04.534896 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.534790 2562 scope.go:117] "RemoveContainer" containerID="68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c" Apr 22 19:05:04.535177 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:04.535153 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c\": container with ID starting with 68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c not found: ID does not exist" containerID="68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c" Apr 22 19:05:04.535246 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.535185 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c"} err="failed to get container status \"68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c\": rpc error: code = NotFound desc = could not find container \"68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c\": container with ID starting with 68c0f9b99f4ccdc63d18ae7830f169153b8687b02b734f6e66f51c1d7272601c not found: ID does not exist" Apr 22 19:05:04.536504 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.536482 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-fb7532-predictor-8c9dccf8b-8dbzp"] Apr 22 19:05:04.840829 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:04.840758 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" path="/var/lib/kubelet/pods/0fb5f39c-b96d-48df-a77e-bed36a074a5c/volumes" Apr 22 
19:05:06.519746 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:06.519720 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6_c36effd3-460e-4a67-bce0-d94829ce0106/storage-initializer/0.log" Apr 22 19:05:06.520111 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:06.519755 2562 generic.go:358] "Generic (PLEG): container finished" podID="c36effd3-460e-4a67-bce0-d94829ce0106" containerID="ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291" exitCode=1 Apr 22 19:05:06.520111 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:06.519823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" event={"ID":"c36effd3-460e-4a67-bce0-d94829ce0106","Type":"ContainerDied","Data":"ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291"} Apr 22 19:05:07.524834 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:07.524809 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6_c36effd3-460e-4a67-bce0-d94829ce0106/storage-initializer/0.log" Apr 22 19:05:07.525243 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:07.524887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" event={"ID":"c36effd3-460e-4a67-bce0-d94829ce0106","Type":"ContainerStarted","Data":"b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c"} Apr 22 19:05:09.908079 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:09.908046 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"] Apr 22 19:05:09.908462 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:09.908297 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" 
podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" containerID="cri-o://b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c" gracePeriod=30 Apr 22 19:05:10.038106 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038076 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"] Apr 22 19:05:10.038424 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038412 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kube-rbac-proxy" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038426 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kube-rbac-proxy" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038435 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c939f87a-cc37-4796-b91c-115915090815" containerName="storage-initializer" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038441 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c939f87a-cc37-4796-b91c-115915090815" containerName="storage-initializer" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038453 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c939f87a-cc37-4796-b91c-115915090815" containerName="storage-initializer" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038459 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c939f87a-cc37-4796-b91c-115915090815" containerName="storage-initializer" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038469 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" Apr 22 
19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038473 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" Apr 22 19:05:10.038485 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038485 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="storage-initializer" Apr 22 19:05:10.038748 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038490 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="storage-initializer" Apr 22 19:05:10.038748 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038545 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kserve-container" Apr 22 19:05:10.038748 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038560 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c939f87a-cc37-4796-b91c-115915090815" containerName="storage-initializer" Apr 22 19:05:10.038748 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038567 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fb5f39c-b96d-48df-a77e-bed36a074a5c" containerName="kube-rbac-proxy" Apr 22 19:05:10.038748 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.038573 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c939f87a-cc37-4796-b91c-115915090815" containerName="storage-initializer" Apr 22 19:05:10.041770 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.041754 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.044465 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.044441 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j46rb\"" Apr 22 19:05:10.044465 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.044455 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-e1926-predictor-serving-cert\"" Apr 22 19:05:10.044621 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.044466 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-e1926-kube-rbac-proxy-sar-config\"" Apr 22 19:05:10.050333 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.050306 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"] Apr 22 19:05:10.204193 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.204165 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb46a12-262e-4897-9c80-96b947d2f091-kserve-provision-location\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.204349 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.204209 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrl4r\" (UniqueName: \"kubernetes.io/projected/ceb46a12-262e-4897-9c80-96b947d2f091-kube-api-access-lrl4r\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.204349 ip-10-0-128-208 
kubenswrapper[2562]: I0422 19:05:10.204320 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceb46a12-262e-4897-9c80-96b947d2f091-proxy-tls\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.204429 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.204351 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-e1926-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ceb46a12-262e-4897-9c80-96b947d2f091-raw-sklearn-e1926-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.304958 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceb46a12-262e-4897-9c80-96b947d2f091-proxy-tls\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.304987 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.304994 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-e1926-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ceb46a12-262e-4897-9c80-96b947d2f091-raw-sklearn-e1926-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.305279 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.305051 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb46a12-262e-4897-9c80-96b947d2f091-kserve-provision-location\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.305279 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.305084 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrl4r\" (UniqueName: \"kubernetes.io/projected/ceb46a12-262e-4897-9c80-96b947d2f091-kube-api-access-lrl4r\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.305457 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.305434 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb46a12-262e-4897-9c80-96b947d2f091-kserve-provision-location\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.305667 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.305648 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-e1926-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ceb46a12-262e-4897-9c80-96b947d2f091-raw-sklearn-e1926-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.307507 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.307484 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ceb46a12-262e-4897-9c80-96b947d2f091-proxy-tls\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.312662 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.312630 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrl4r\" (UniqueName: \"kubernetes.io/projected/ceb46a12-262e-4897-9c80-96b947d2f091-kube-api-access-lrl4r\") pod \"raw-sklearn-e1926-predictor-7fd597f889-kgzmx\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") " pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.352663 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.352637 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:10.469715 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.469684 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"] Apr 22 19:05:10.471537 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:05:10.471432 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb46a12_262e_4897_9c80_96b947d2f091.slice/crio-f6c67637f53db172a4ab63c8d520c7f4ad798240790ab4889ac96d3ccb4f9d3a WatchSource:0}: Error finding container f6c67637f53db172a4ab63c8d520c7f4ad798240790ab4889ac96d3ccb4f9d3a: Status 404 returned error can't find the container with id f6c67637f53db172a4ab63c8d520c7f4ad798240790ab4889ac96d3ccb4f9d3a Apr 22 19:05:10.534665 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:10.534635 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" 
event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerStarted","Data":"f6c67637f53db172a4ab63c8d520c7f4ad798240790ab4889ac96d3ccb4f9d3a"} Apr 22 19:05:11.539332 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.539300 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerStarted","Data":"63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395"} Apr 22 19:05:11.847996 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.847975 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6_c36effd3-460e-4a67-bce0-d94829ce0106/storage-initializer/1.log" Apr 22 19:05:11.848346 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.848331 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6_c36effd3-460e-4a67-bce0-d94829ce0106/storage-initializer/0.log" Apr 22 19:05:11.848408 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.848396 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" Apr 22 19:05:11.918995 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.918971 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h2pk\" (UniqueName: \"kubernetes.io/projected/c36effd3-460e-4a67-bce0-d94829ce0106-kube-api-access-8h2pk\") pod \"c36effd3-460e-4a67-bce0-d94829ce0106\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " Apr 22 19:05:11.919132 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919041 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls\") pod \"c36effd3-460e-4a67-bce0-d94829ce0106\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " Apr 22 19:05:11.919231 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919218 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36effd3-460e-4a67-bce0-d94829ce0106-kserve-provision-location\") pod \"c36effd3-460e-4a67-bce0-d94829ce0106\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " Apr 22 19:05:11.919322 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919248 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\") pod \"c36effd3-460e-4a67-bce0-d94829ce0106\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " Apr 22 19:05:11.919322 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919297 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-cabundle-cert\") pod 
\"c36effd3-460e-4a67-bce0-d94829ce0106\" (UID: \"c36effd3-460e-4a67-bce0-d94829ce0106\") " Apr 22 19:05:11.919529 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919505 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c36effd3-460e-4a67-bce0-d94829ce0106-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c36effd3-460e-4a67-bce0-d94829ce0106" (UID: "c36effd3-460e-4a67-bce0-d94829ce0106"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:05:11.919598 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919575 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c36effd3-460e-4a67-bce0-d94829ce0106-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:11.919664 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919637 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-isvc-init-fail-afbf81-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-afbf81-kube-rbac-proxy-sar-config") pod "c36effd3-460e-4a67-bce0-d94829ce0106" (UID: "c36effd3-460e-4a67-bce0-d94829ce0106"). InnerVolumeSpecName "isvc-init-fail-afbf81-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:05:11.919779 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.919753 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c36effd3-460e-4a67-bce0-d94829ce0106" (UID: "c36effd3-460e-4a67-bce0-d94829ce0106"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:05:11.921298 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.921270 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c36effd3-460e-4a67-bce0-d94829ce0106-kube-api-access-8h2pk" (OuterVolumeSpecName: "kube-api-access-8h2pk") pod "c36effd3-460e-4a67-bce0-d94829ce0106" (UID: "c36effd3-460e-4a67-bce0-d94829ce0106"). InnerVolumeSpecName "kube-api-access-8h2pk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:05:11.921385 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:11.921321 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c36effd3-460e-4a67-bce0-d94829ce0106" (UID: "c36effd3-460e-4a67-bce0-d94829ce0106"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:05:12.020878 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.020852 2562 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-isvc-init-fail-afbf81-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:12.020878 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.020877 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c36effd3-460e-4a67-bce0-d94829ce0106-cabundle-cert\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:12.020999 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.020888 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8h2pk\" (UniqueName: \"kubernetes.io/projected/c36effd3-460e-4a67-bce0-d94829ce0106-kube-api-access-8h2pk\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 
22 19:05:12.020999 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.020898 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c36effd3-460e-4a67-bce0-d94829ce0106-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:05:12.543500 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543474 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6_c36effd3-460e-4a67-bce0-d94829ce0106/storage-initializer/1.log" Apr 22 19:05:12.543905 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543862 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6_c36effd3-460e-4a67-bce0-d94829ce0106/storage-initializer/0.log" Apr 22 19:05:12.543905 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543894 2562 generic.go:358] "Generic (PLEG): container finished" podID="c36effd3-460e-4a67-bce0-d94829ce0106" containerID="b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c" exitCode=1 Apr 22 19:05:12.543980 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543925 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" event={"ID":"c36effd3-460e-4a67-bce0-d94829ce0106","Type":"ContainerDied","Data":"b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c"} Apr 22 19:05:12.543980 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543970 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" event={"ID":"c36effd3-460e-4a67-bce0-d94829ce0106","Type":"ContainerDied","Data":"b98a09ed692eee33cdfb1befda7bdeb1b35ffa4c7cc73b0292af78da9f203634"} Apr 22 19:05:12.544071 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543998 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6" Apr 22 19:05:12.544071 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.543992 2562 scope.go:117] "RemoveContainer" containerID="b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c" Apr 22 19:05:12.553230 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.553010 2562 scope.go:117] "RemoveContainer" containerID="ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291" Apr 22 19:05:12.560165 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.560147 2562 scope.go:117] "RemoveContainer" containerID="b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c" Apr 22 19:05:12.560394 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:12.560375 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c\": container with ID starting with b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c not found: ID does not exist" containerID="b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c" Apr 22 19:05:12.560474 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.560407 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c"} err="failed to get container status \"b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c\": rpc error: code = NotFound desc = could not find container \"b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c\": container with ID starting with b174fb01017973d8af517576da55680550738c901bde8d25e2f83ac0afac296c not found: ID does not exist" Apr 22 19:05:12.560474 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.560452 2562 scope.go:117] "RemoveContainer" containerID="ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291" Apr 22 
19:05:12.560705 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:05:12.560687 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291\": container with ID starting with ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291 not found: ID does not exist" containerID="ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291" Apr 22 19:05:12.560778 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.560711 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291"} err="failed to get container status \"ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291\": rpc error: code = NotFound desc = could not find container \"ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291\": container with ID starting with ee75f5128e20afd93ae411464ab8141d5a2dbc0553a0bcf4c741abc8bbe82291 not found: ID does not exist" Apr 22 19:05:12.588956 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.588901 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"] Apr 22 19:05:12.590332 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.590305 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-afbf81-predictor-6c759f6f4c-m9fb6"] Apr 22 19:05:12.841786 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:12.841715 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c36effd3-460e-4a67-bce0-d94829ce0106" path="/var/lib/kubelet/pods/c36effd3-460e-4a67-bce0-d94829ce0106/volumes" Apr 22 19:05:14.551836 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:14.551760 2562 generic.go:358] "Generic (PLEG): container finished" podID="ceb46a12-262e-4897-9c80-96b947d2f091" 
containerID="63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395" exitCode=0 Apr 22 19:05:14.551836 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:14.551815 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerDied","Data":"63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395"} Apr 22 19:05:15.556315 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:15.556278 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerStarted","Data":"ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be"} Apr 22 19:05:15.556315 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:15.556321 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerStarted","Data":"33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688"} Apr 22 19:05:15.556726 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:15.556546 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:15.576695 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:15.576654 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podStartSLOduration=5.576640283 podStartE2EDuration="5.576640283s" podCreationTimestamp="2026-04-22 19:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:05:15.574840564 +0000 UTC m=+1127.254066882" watchObservedRunningTime="2026-04-22 19:05:15.576640283 +0000 UTC 
m=+1127.255866575" Apr 22 19:05:16.559508 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:16.559478 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:16.560990 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:16.560965 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:05:17.563169 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:17.563136 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:05:22.567639 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:22.567611 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:05:22.568152 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:22.568127 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:05:32.568868 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:32.568835 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:05:42.569134 
ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:42.569046 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:05:52.568868 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:05:52.568832 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:06:02.568470 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:02.568432 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:06:12.568259 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:12.568219 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused" Apr 22 19:06:22.569228 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:22.569194 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" Apr 22 19:06:30.148624 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.148588 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"] Apr 22 19:06:30.149128 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.148911 2562 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" containerID="cri-o://33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688" gracePeriod=30 Apr 22 19:06:30.149128 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.148935 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kube-rbac-proxy" containerID="cri-o://ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be" gracePeriod=30 Apr 22 19:06:30.208413 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208385 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"] Apr 22 19:06:30.208799 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208782 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" Apr 22 19:06:30.208873 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208803 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" Apr 22 19:06:30.208873 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208837 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" Apr 22 19:06:30.208873 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208847 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" Apr 22 19:06:30.208970 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208932 2562 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" Apr 22 19:06:30.208970 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.208949 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c36effd3-460e-4a67-bce0-d94829ce0106" containerName="storage-initializer" Apr 22 19:06:30.212294 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.212275 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.214518 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.214501 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-5e900-predictor-serving-cert\"" Apr 22 19:06:30.214598 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.214545 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\"" Apr 22 19:06:30.221418 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.221394 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"] Apr 22 19:06:30.247487 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.247461 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kserve-provision-location\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.247598 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.247505 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.247666 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.247596 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.247666 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.247641 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmxxj\" (UniqueName: \"kubernetes.io/projected/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kube-api-access-kmxxj\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.348811 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.348778 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kserve-provision-location\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.348972 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.348825 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.348972 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.348889 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.348972 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.348935 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmxxj\" (UniqueName: \"kubernetes.io/projected/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kube-api-access-kmxxj\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.349180 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:06:30.349098 2562 secret.go:189] Couldn't get secret kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-serving-cert: secret "raw-sklearn-runtime-5e900-predictor-serving-cert" not found Apr 22 19:06:30.349180 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:06:30.349172 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls podName:7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:30.849151918 +0000 UTC m=+1202.528378188 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls") pod "raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" (UID: "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58") : secret "raw-sklearn-runtime-5e900-predictor-serving-cert" not found Apr 22 19:06:30.349303 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.349227 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kserve-provision-location\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.349514 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.349494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.357881 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.357858 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmxxj\" (UniqueName: \"kubernetes.io/projected/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kube-api-access-kmxxj\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" Apr 22 19:06:30.791527 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.791493 2562 generic.go:358] "Generic (PLEG): container finished" podID="ceb46a12-262e-4897-9c80-96b947d2f091" 
containerID="ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be" exitCode=2
Apr 22 19:06:30.791693 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.791571 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerDied","Data":"ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be"}
Apr 22 19:06:30.853253 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.853224 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:06:30.855644 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:30.855616 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls\") pod \"raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:06:31.122727 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:31.122624 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:06:31.244691 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:31.244551 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"]
Apr 22 19:06:31.247046 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:06:31.246998 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c62d9f3_8aaa_44a5_8e53_6cc574eb8b58.slice/crio-c0caba0377da4ef31ab52564f9cd3069cee755f52b330dccc62a44d17011757d WatchSource:0}: Error finding container c0caba0377da4ef31ab52564f9cd3069cee755f52b330dccc62a44d17011757d: Status 404 returned error can't find the container with id c0caba0377da4ef31ab52564f9cd3069cee755f52b330dccc62a44d17011757d
Apr 22 19:06:31.796322 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:31.796282 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerStarted","Data":"07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40"}
Apr 22 19:06:31.796496 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:31.796328 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerStarted","Data":"c0caba0377da4ef31ab52564f9cd3069cee755f52b330dccc62a44d17011757d"}
Apr 22 19:06:32.564132 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:32.564086 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.33:8643/healthz\": dial tcp 10.134.0.33:8643: connect: connection refused"
Apr 22 19:06:32.568348 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:32.568322 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.33:8080: connect: connection refused"
Apr 22 19:06:34.096861 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.096834 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"
Apr 22 19:06:34.190151 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190079 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrl4r\" (UniqueName: \"kubernetes.io/projected/ceb46a12-262e-4897-9c80-96b947d2f091-kube-api-access-lrl4r\") pod \"ceb46a12-262e-4897-9c80-96b947d2f091\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") "
Apr 22 19:06:34.190338 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190319 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb46a12-262e-4897-9c80-96b947d2f091-kserve-provision-location\") pod \"ceb46a12-262e-4897-9c80-96b947d2f091\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") "
Apr 22 19:06:34.190405 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190389 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-e1926-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ceb46a12-262e-4897-9c80-96b947d2f091-raw-sklearn-e1926-kube-rbac-proxy-sar-config\") pod \"ceb46a12-262e-4897-9c80-96b947d2f091\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") "
Apr 22 19:06:34.190483 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190460 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceb46a12-262e-4897-9c80-96b947d2f091-proxy-tls\") pod \"ceb46a12-262e-4897-9c80-96b947d2f091\" (UID: \"ceb46a12-262e-4897-9c80-96b947d2f091\") "
Apr 22 19:06:34.190607 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190586 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb46a12-262e-4897-9c80-96b947d2f091-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ceb46a12-262e-4897-9c80-96b947d2f091" (UID: "ceb46a12-262e-4897-9c80-96b947d2f091"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:06:34.190735 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190718 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ceb46a12-262e-4897-9c80-96b947d2f091-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:06:34.190735 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.190717 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb46a12-262e-4897-9c80-96b947d2f091-raw-sklearn-e1926-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-e1926-kube-rbac-proxy-sar-config") pod "ceb46a12-262e-4897-9c80-96b947d2f091" (UID: "ceb46a12-262e-4897-9c80-96b947d2f091"). InnerVolumeSpecName "raw-sklearn-e1926-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:06:34.192463 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.192438 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb46a12-262e-4897-9c80-96b947d2f091-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ceb46a12-262e-4897-9c80-96b947d2f091" (UID: "ceb46a12-262e-4897-9c80-96b947d2f091"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:06:34.192728 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.192707 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb46a12-262e-4897-9c80-96b947d2f091-kube-api-access-lrl4r" (OuterVolumeSpecName: "kube-api-access-lrl4r") pod "ceb46a12-262e-4897-9c80-96b947d2f091" (UID: "ceb46a12-262e-4897-9c80-96b947d2f091"). InnerVolumeSpecName "kube-api-access-lrl4r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:06:34.291256 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.291229 2562 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-e1926-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ceb46a12-262e-4897-9c80-96b947d2f091-raw-sklearn-e1926-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:06:34.291256 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.291253 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ceb46a12-262e-4897-9c80-96b947d2f091-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:06:34.291256 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.291264 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrl4r\" (UniqueName: \"kubernetes.io/projected/ceb46a12-262e-4897-9c80-96b947d2f091-kube-api-access-lrl4r\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:06:34.806853 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.806819 2562 generic.go:358] "Generic (PLEG): container finished" podID="ceb46a12-262e-4897-9c80-96b947d2f091" containerID="33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688" exitCode=0
Apr 22 19:06:34.807070 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.806904 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerDied","Data":"33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688"}
Apr 22 19:06:34.807070 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.806936 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"
Apr 22 19:06:34.807070 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.806946 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx" event={"ID":"ceb46a12-262e-4897-9c80-96b947d2f091","Type":"ContainerDied","Data":"f6c67637f53db172a4ab63c8d520c7f4ad798240790ab4889ac96d3ccb4f9d3a"}
Apr 22 19:06:34.807070 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.806963 2562 scope.go:117] "RemoveContainer" containerID="ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be"
Apr 22 19:06:34.815735 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.815718 2562 scope.go:117] "RemoveContainer" containerID="33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688"
Apr 22 19:06:34.822562 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.822544 2562 scope.go:117] "RemoveContainer" containerID="63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395"
Apr 22 19:06:34.828299 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.828277 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"]
Apr 22 19:06:34.831393 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.831100 2562 scope.go:117] "RemoveContainer" containerID="ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be"
Apr 22 19:06:34.832956 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.832937 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-e1926-predictor-7fd597f889-kgzmx"]
Apr 22 19:06:34.833063 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:06:34.833041 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be\": container with ID starting with ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be not found: ID does not exist" containerID="ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be"
Apr 22 19:06:34.833121 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.833062 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be"} err="failed to get container status \"ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be\": rpc error: code = NotFound desc = could not find container \"ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be\": container with ID starting with ada48fc1460ccbde36af5c018c94d5f608c4c9d63b1169460338d1e6ba6ed7be not found: ID does not exist"
Apr 22 19:06:34.833121 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.833078 2562 scope.go:117] "RemoveContainer" containerID="33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688"
Apr 22 19:06:34.833374 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:06:34.833356 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688\": container with ID starting with 33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688 not found: ID does not exist" containerID="33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688"
Apr 22 19:06:34.833430 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.833376 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688"} err="failed to get container status \"33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688\": rpc error: code = NotFound desc = could not find container \"33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688\": container with ID starting with 33a204315309f8cfd483d8ca5934f2097741bf276a83cb39d9a64d45997d9688 not found: ID does not exist"
Apr 22 19:06:34.833430 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.833399 2562 scope.go:117] "RemoveContainer" containerID="63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395"
Apr 22 19:06:34.833636 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:06:34.833621 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395\": container with ID starting with 63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395 not found: ID does not exist" containerID="63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395"
Apr 22 19:06:34.833688 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.833640 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395"} err="failed to get container status \"63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395\": rpc error: code = NotFound desc = could not find container \"63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395\": container with ID starting with 63fb0588e5c5981f0fb2442c0ba4bf23501c1fd65d922128036af29c6691f395 not found: ID does not exist"
Apr 22 19:06:34.840750 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:34.840721 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" path="/var/lib/kubelet/pods/ceb46a12-262e-4897-9c80-96b947d2f091/volumes"
Apr 22 19:06:35.811164 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:35.811131 2562 generic.go:358] "Generic (PLEG): container finished" podID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerID="07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40" exitCode=0
Apr 22 19:06:35.811565 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:35.811207 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerDied","Data":"07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40"}
Apr 22 19:06:36.816912 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:36.816877 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerStarted","Data":"89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab"}
Apr 22 19:06:36.816912 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:36.816913 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerStarted","Data":"ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33"}
Apr 22 19:06:36.817443 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:36.817228 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:06:36.817443 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:36.817367 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:06:36.818626 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:36.818598 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:06:36.836523 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:36.836473 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podStartSLOduration=6.8364612959999995 podStartE2EDuration="6.836461296s" podCreationTimestamp="2026-04-22 19:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:36.834058276 +0000 UTC m=+1208.513284565" watchObservedRunningTime="2026-04-22 19:06:36.836461296 +0000 UTC m=+1208.515687587"
Apr 22 19:06:37.820163 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:37.820134 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:06:42.825125 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:42.825099 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:06:42.825692 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:42.825667 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:06:52.825892 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:06:52.825855 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:07:02.826232 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:02.826197 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:07:12.826036 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:12.825944 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:07:22.825746 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:22.825707 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:07:32.826295 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:32.826256 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:07:42.826178 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:42.826152 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:07:50.297981 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:50.297943 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"]
Apr 22 19:07:50.298410 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:50.298277 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" containerID="cri-o://ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33" gracePeriod=30
Apr 22 19:07:50.298410 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:50.298319 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kube-rbac-proxy" containerID="cri-o://89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab" gracePeriod=30
Apr 22 19:07:51.033512 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.033479 2562 generic.go:358] "Generic (PLEG): container finished" podID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerID="89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab" exitCode=2
Apr 22 19:07:51.033512 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.033515 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerDied","Data":"89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab"}
Apr 22 19:07:51.360616 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360540 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7g8kz/must-gather-qcx9b"]
Apr 22 19:07:51.360951 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360857 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container"
Apr 22 19:07:51.360951 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360868 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container"
Apr 22 19:07:51.360951 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360882 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="storage-initializer"
Apr 22 19:07:51.360951 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360888 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="storage-initializer"
Apr 22 19:07:51.360951 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360904 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kube-rbac-proxy"
Apr 22 19:07:51.360951 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360909 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kube-rbac-proxy"
Apr 22 19:07:51.361174 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360959 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kserve-container"
Apr 22 19:07:51.361174 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.360972 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ceb46a12-262e-4897-9c80-96b947d2f091" containerName="kube-rbac-proxy"
Apr 22 19:07:51.363975 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.363959 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.366291 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.366271 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7g8kz\"/\"openshift-service-ca.crt\""
Apr 22 19:07:51.366399 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.366327 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7g8kz\"/\"kube-root-ca.crt\""
Apr 22 19:07:51.366399 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.366355 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7g8kz\"/\"default-dockercfg-wkh4t\""
Apr 22 19:07:51.371131 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.371111 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7g8kz/must-gather-qcx9b"]
Apr 22 19:07:51.500150 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.500120 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-must-gather-output\") pod \"must-gather-qcx9b\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.500315 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.500207 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtmn7\" (UniqueName: \"kubernetes.io/projected/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-kube-api-access-mtmn7\") pod \"must-gather-qcx9b\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.600525 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.600500 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtmn7\" (UniqueName: \"kubernetes.io/projected/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-kube-api-access-mtmn7\") pod \"must-gather-qcx9b\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.600684 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.600549 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-must-gather-output\") pod \"must-gather-qcx9b\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.600853 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.600826 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-must-gather-output\") pod \"must-gather-qcx9b\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.609184 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.609161 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtmn7\" (UniqueName: \"kubernetes.io/projected/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-kube-api-access-mtmn7\") pod \"must-gather-qcx9b\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.684631 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.684574 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7g8kz/must-gather-qcx9b"
Apr 22 19:07:51.797839 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:51.797814 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7g8kz/must-gather-qcx9b"]
Apr 22 19:07:51.800286 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:07:51.800262 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5923c8cb_4ee7_49fd_8a46_c6ead6c4b7db.slice/crio-c84eb758a1b82a80e84d2dda9ffc486deaee92f7e52a1dd554e1674d24cb011d WatchSource:0}: Error finding container c84eb758a1b82a80e84d2dda9ffc486deaee92f7e52a1dd554e1674d24cb011d: Status 404 returned error can't find the container with id c84eb758a1b82a80e84d2dda9ffc486deaee92f7e52a1dd554e1674d24cb011d
Apr 22 19:07:52.038446 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:52.038410 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" event={"ID":"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db","Type":"ContainerStarted","Data":"c84eb758a1b82a80e84d2dda9ffc486deaee92f7e52a1dd554e1674d24cb011d"}
Apr 22 19:07:52.821209 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:52.821164 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.34:8643/healthz\": dial tcp 10.134.0.34:8643: connect: connection refused"
Apr 22 19:07:52.826612 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:52.826580 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 22 19:07:55.451438 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.451413 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:07:55.535533 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.535459 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\") pod \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") "
Apr 22 19:07:55.535533 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.535508 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kserve-provision-location\") pod \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") "
Apr 22 19:07:55.535768 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.535546 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmxxj\" (UniqueName: \"kubernetes.io/projected/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kube-api-access-kmxxj\") pod \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") "
Apr 22 19:07:55.535768 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.535597 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls\") pod \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\" (UID: \"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58\") "
Apr 22 19:07:55.535930 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.535880 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config") pod "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" (UID: "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58"). InnerVolumeSpecName "raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:07:55.535930 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.535916 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" (UID: "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:07:55.538416 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.538380 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kube-api-access-kmxxj" (OuterVolumeSpecName: "kube-api-access-kmxxj") pod "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" (UID: "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58"). InnerVolumeSpecName "kube-api-access-kmxxj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:07:55.538531 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.538447 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" (UID: "7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:07:55.637155 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.637124 2562 reconciler_common.go:299] "Volume detached for volume \"raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-raw-sklearn-runtime-5e900-kube-rbac-proxy-sar-config\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:07:55.637155 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.637155 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kserve-provision-location\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:07:55.637354 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.637172 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmxxj\" (UniqueName: \"kubernetes.io/projected/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-kube-api-access-kmxxj\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:07:55.637354 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:55.637187 2562 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58-proxy-tls\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\""
Apr 22 19:07:56.052627 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.052577 2562 generic.go:358] "Generic (PLEG): container finished" podID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerID="ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33" exitCode=0
Apr 22 19:07:56.052802 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.052646 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerDied","Data":"ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33"}
Apr 22 19:07:56.052802 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.052678 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"
Apr 22 19:07:56.052802 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.052696 2562 scope.go:117] "RemoveContainer" containerID="89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab"
Apr 22 19:07:56.052984 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.052683 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq" event={"ID":"7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58","Type":"ContainerDied","Data":"c0caba0377da4ef31ab52564f9cd3069cee755f52b330dccc62a44d17011757d"}
Apr 22 19:07:56.077313 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.077283 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"]
Apr 22 19:07:56.082070 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.082045 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-5e900-predictor-7dd89bd74c-gvgwq"]
Apr 22 19:07:56.371494 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.371339 2562 scope.go:117] "RemoveContainer" containerID="ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33"
Apr 22 19:07:56.378528 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.378511 2562 scope.go:117] "RemoveContainer" containerID="07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40"
Apr 22 19:07:56.384989 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.384972 2562 scope.go:117] "RemoveContainer" containerID="89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab"
Apr 22 19:07:56.385281 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:07:56.385264 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab\": container with ID starting with 89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab not found: ID does not exist" containerID="89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab"
Apr 22 19:07:56.385344 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.385289 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab"} err="failed to get container status \"89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab\": rpc error: code = NotFound desc = could not find container \"89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab\": container with ID starting with 89eeb11dce4075f463f29ab95924bd9c0253a6251d2a2c00356d5fd2ea1432ab not found: ID does not exist"
Apr 22 19:07:56.385344 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.385308 2562 scope.go:117] "RemoveContainer" containerID="ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33"
Apr 22 19:07:56.385558 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:07:56.385542 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33\": container with ID starting with ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33 not found: ID does not exist" containerID="ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33"
Apr 22 19:07:56.385606 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.385565 2562 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33"} err="failed to get container status \"ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33\": rpc error: code = NotFound desc = could not find container \"ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33\": container with ID starting with ee53ffb9e6d743a1c827fd60638fff8ac2bd9357049210b8664588266aabfa33 not found: ID does not exist" Apr 22 19:07:56.385606 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.385581 2562 scope.go:117] "RemoveContainer" containerID="07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40" Apr 22 19:07:56.385809 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:07:56.385790 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40\": container with ID starting with 07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40 not found: ID does not exist" containerID="07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40" Apr 22 19:07:56.385864 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.385818 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40"} err="failed to get container status \"07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40\": rpc error: code = NotFound desc = could not find container \"07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40\": container with ID starting with 07a6275d8c9ff89d68e2e7ce4f4744a386dfe36261459df505c0283287b21e40 not found: ID does not exist" Apr 22 19:07:56.842192 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:56.842160 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" 
path="/var/lib/kubelet/pods/7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58/volumes" Apr 22 19:07:57.058611 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:57.058575 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" event={"ID":"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db","Type":"ContainerStarted","Data":"7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631"} Apr 22 19:07:57.058611 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:57.058610 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" event={"ID":"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db","Type":"ContainerStarted","Data":"7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d"} Apr 22 19:07:57.075269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:07:57.075204 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" podStartSLOduration=1.445138614 podStartE2EDuration="6.075185649s" podCreationTimestamp="2026-04-22 19:07:51 +0000 UTC" firstStartedPulling="2026-04-22 19:07:51.802304067 +0000 UTC m=+1283.481530337" lastFinishedPulling="2026-04-22 19:07:56.432351098 +0000 UTC m=+1288.111577372" observedRunningTime="2026-04-22 19:07:57.072489602 +0000 UTC m=+1288.751715893" watchObservedRunningTime="2026-04-22 19:07:57.075185649 +0000 UTC m=+1288.754411942" Apr 22 19:08:14.112219 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:14.112180 2562 generic.go:358] "Generic (PLEG): container finished" podID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerID="7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d" exitCode=0 Apr 22 19:08:14.112219 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:14.112220 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" 
event={"ID":"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db","Type":"ContainerDied","Data":"7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d"} Apr 22 19:08:14.112664 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:14.112550 2562 scope.go:117] "RemoveContainer" containerID="7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d" Apr 22 19:08:14.915010 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:14.914980 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7g8kz_must-gather-qcx9b_5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db/gather/0.log" Apr 22 19:08:15.448847 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.448807 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k99vz/must-gather-fchrf"] Apr 22 19:08:15.449245 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449189 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" Apr 22 19:08:15.449245 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449202 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" Apr 22 19:08:15.449245 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449225 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kube-rbac-proxy" Apr 22 19:08:15.449245 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449234 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kube-rbac-proxy" Apr 22 19:08:15.449381 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449249 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="storage-initializer" Apr 22 19:08:15.449381 ip-10-0-128-208 kubenswrapper[2562]: I0422 
19:08:15.449256 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="storage-initializer" Apr 22 19:08:15.449381 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449323 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kserve-container" Apr 22 19:08:15.449381 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.449334 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c62d9f3-8aaa-44a5-8e53-6cc574eb8b58" containerName="kube-rbac-proxy" Apr 22 19:08:15.451701 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.451686 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.455857 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.455838 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k99vz\"/\"kube-root-ca.crt\"" Apr 22 19:08:15.459281 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.459265 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k99vz\"/\"openshift-service-ca.crt\"" Apr 22 19:08:15.461318 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.461305 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k99vz\"/\"default-dockercfg-q7x9l\"" Apr 22 19:08:15.465931 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.465911 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/must-gather-fchrf"] Apr 22 19:08:15.614537 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.614499 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3-must-gather-output\") pod \"must-gather-fchrf\" (UID: 
\"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3\") " pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.614537 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.614539 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9ld\" (UniqueName: \"kubernetes.io/projected/cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3-kube-api-access-rp9ld\") pod \"must-gather-fchrf\" (UID: \"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3\") " pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.715712 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.715602 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3-must-gather-output\") pod \"must-gather-fchrf\" (UID: \"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3\") " pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.715712 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.715656 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9ld\" (UniqueName: \"kubernetes.io/projected/cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3-kube-api-access-rp9ld\") pod \"must-gather-fchrf\" (UID: \"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3\") " pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.716063 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.716003 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3-must-gather-output\") pod \"must-gather-fchrf\" (UID: \"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3\") " pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.724798 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.724776 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9ld\" (UniqueName: 
\"kubernetes.io/projected/cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3-kube-api-access-rp9ld\") pod \"must-gather-fchrf\" (UID: \"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3\") " pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.760295 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.760260 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/must-gather-fchrf" Apr 22 19:08:15.883574 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:15.881275 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/must-gather-fchrf"] Apr 22 19:08:15.884361 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:08:15.884324 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcca8498f_0a4b_4ee2_9d0c_f9a89d9b73d3.slice/crio-d98c39086f71f5b58303138595436188a6ab6263ee0237fa538aabf081c03c3f WatchSource:0}: Error finding container d98c39086f71f5b58303138595436188a6ab6263ee0237fa538aabf081c03c3f: Status 404 returned error can't find the container with id d98c39086f71f5b58303138595436188a6ab6263ee0237fa538aabf081c03c3f Apr 22 19:08:16.119211 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:16.119129 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/must-gather-fchrf" event={"ID":"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3","Type":"ContainerStarted","Data":"d98c39086f71f5b58303138595436188a6ab6263ee0237fa538aabf081c03c3f"} Apr 22 19:08:17.125758 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:17.125723 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/must-gather-fchrf" event={"ID":"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3","Type":"ContainerStarted","Data":"7d37a7b452fad9d180dac832bbe1e701c3a683617d3c6fa17b4ee2174d115b31"} Apr 22 19:08:17.125758 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:17.125760 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-k99vz/must-gather-fchrf" event={"ID":"cca8498f-0a4b-4ee2-9d0c-f9a89d9b73d3","Type":"ContainerStarted","Data":"43ac4cb4a9077a122561ec6948954a3749644075032b7fbe8141c415186a6986"} Apr 22 19:08:17.144375 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:17.144316 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k99vz/must-gather-fchrf" podStartSLOduration=1.252961166 podStartE2EDuration="2.144296641s" podCreationTimestamp="2026-04-22 19:08:15 +0000 UTC" firstStartedPulling="2026-04-22 19:08:15.886454264 +0000 UTC m=+1307.565680537" lastFinishedPulling="2026-04-22 19:08:16.777789739 +0000 UTC m=+1308.457016012" observedRunningTime="2026-04-22 19:08:17.141293872 +0000 UTC m=+1308.820520163" watchObservedRunningTime="2026-04-22 19:08:17.144296641 +0000 UTC m=+1308.823522936" Apr 22 19:08:18.227389 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:18.227359 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-r5vf6_f2bae4e9-5ac7-46d8-beee-9c9a3a81031b/global-pull-secret-syncer/0.log" Apr 22 19:08:18.328046 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:18.327982 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-m7cgq_9bd73bb8-d0cb-40bb-828c-febdb1ce4ac0/konnectivity-agent/0.log" Apr 22 19:08:18.353282 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:18.353245 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-208.ec2.internal_2e0c91486769e79b8cafe9f0bd44a6b4/haproxy/0.log" Apr 22 19:08:20.285333 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.285281 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7g8kz/must-gather-qcx9b"] Apr 22 19:08:20.286169 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.285880 2562 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-must-gather-7g8kz/must-gather-qcx9b" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="copy" containerID="cri-o://7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631" gracePeriod=2 Apr 22 19:08:20.291671 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.291641 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7g8kz/must-gather-qcx9b"] Apr 22 19:08:20.692081 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.691573 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7g8kz_must-gather-qcx9b_5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db/copy/0.log" Apr 22 19:08:20.692081 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.692032 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" Apr 22 19:08:20.700363 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.696227 2562 status_manager.go:895] "Failed to get status for pod" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" err="pods \"must-gather-qcx9b\" is forbidden: User \"system:node:ip-10-0-128-208.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7g8kz\": no relationship found between node 'ip-10-0-128-208.ec2.internal' and this object" Apr 22 19:08:20.773553 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.773518 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-must-gather-output\") pod \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " Apr 22 19:08:20.773832 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.773813 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtmn7\" (UniqueName: 
\"kubernetes.io/projected/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-kube-api-access-mtmn7\") pod \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\" (UID: \"5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db\") " Apr 22 19:08:20.774974 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.774937 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" (UID: "5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:08:20.786522 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.786477 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-kube-api-access-mtmn7" (OuterVolumeSpecName: "kube-api-access-mtmn7") pod "5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" (UID: "5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db"). InnerVolumeSpecName "kube-api-access-mtmn7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:08:20.846101 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.843625 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" path="/var/lib/kubelet/pods/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db/volumes" Apr 22 19:08:20.875914 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.875857 2562 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-must-gather-output\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:08:20.875914 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:20.875914 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mtmn7\" (UniqueName: \"kubernetes.io/projected/5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db-kube-api-access-mtmn7\") on node \"ip-10-0-128-208.ec2.internal\" DevicePath \"\"" Apr 22 19:08:21.144459 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.144422 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7g8kz_must-gather-qcx9b_5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db/copy/0.log" Apr 22 19:08:21.144853 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.144821 2562 generic.go:358] "Generic (PLEG): container finished" podID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerID="7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631" exitCode=143 Apr 22 19:08:21.144931 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.144915 2562 scope.go:117] "RemoveContainer" containerID="7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631" Apr 22 19:08:21.145073 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.145056 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7g8kz/must-gather-qcx9b" Apr 22 19:08:21.161264 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.161230 2562 scope.go:117] "RemoveContainer" containerID="7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d" Apr 22 19:08:21.183498 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.183474 2562 scope.go:117] "RemoveContainer" containerID="7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631" Apr 22 19:08:21.183893 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:08:21.183863 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631\": container with ID starting with 7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631 not found: ID does not exist" containerID="7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631" Apr 22 19:08:21.184031 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.183903 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631"} err="failed to get container status \"7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631\": rpc error: code = NotFound desc = could not find container \"7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631\": container with ID starting with 7954c89eec047d9caddeff348ea254ae7dad9c5e27e86e7ab8b300817f31b631 not found: ID does not exist" Apr 22 19:08:21.184031 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.183931 2562 scope.go:117] "RemoveContainer" containerID="7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d" Apr 22 19:08:21.184259 ip-10-0-128-208 kubenswrapper[2562]: E0422 19:08:21.184224 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d\": container with ID starting with 7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d not found: ID does not exist" containerID="7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d" Apr 22 19:08:21.184344 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.184267 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d"} err="failed to get container status \"7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d\": rpc error: code = NotFound desc = could not find container \"7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d\": container with ID starting with 7d33fac4086c5da77b42ca7f39c9cc1f0d081d6431a6dd128d8638a63e26701d not found: ID does not exist" Apr 22 19:08:21.944300 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.944271 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/alertmanager/0.log" Apr 22 19:08:21.971271 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.971240 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/config-reloader/0.log" Apr 22 19:08:21.992901 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:21.992875 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/kube-rbac-proxy-web/0.log" Apr 22 19:08:22.022575 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.022544 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/kube-rbac-proxy/0.log" Apr 22 19:08:22.045524 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.045488 2562 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/kube-rbac-proxy-metric/0.log" Apr 22 19:08:22.064867 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.064833 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/prom-label-proxy/0.log" Apr 22 19:08:22.086522 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.086489 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7913899-f23d-44b1-b927-ea1c19e65898/init-config-reloader/0.log" Apr 22 19:08:22.155820 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.155784 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mzh9k_2841b4ea-b22d-494c-89ef-c9e8cea71efd/kube-state-metrics/0.log" Apr 22 19:08:22.180152 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.180120 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mzh9k_2841b4ea-b22d-494c-89ef-c9e8cea71efd/kube-rbac-proxy-main/0.log" Apr 22 19:08:22.204129 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.204094 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mzh9k_2841b4ea-b22d-494c-89ef-c9e8cea71efd/kube-rbac-proxy-self/0.log" Apr 22 19:08:22.233544 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.233514 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-864cb4674-xgnfz_7cfd4bf2-8f28-408b-a116-ad7939016998/metrics-server/0.log" Apr 22 19:08:22.256989 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.256959 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4wbp6_c42fbe22-3543-489f-99d7-1f5578ffae18/monitoring-plugin/0.log" Apr 22 19:08:22.347392 ip-10-0-128-208 
kubenswrapper[2562]: I0422 19:08:22.347347 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jhkhr_74910032-40a3-4178-a85b-c09cd90b2d70/node-exporter/0.log"
Apr 22 19:08:22.367289 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.367265 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jhkhr_74910032-40a3-4178-a85b-c09cd90b2d70/kube-rbac-proxy/0.log"
Apr 22 19:08:22.389864 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.389833 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jhkhr_74910032-40a3-4178-a85b-c09cd90b2d70/init-textfile/0.log"
Apr 22 19:08:22.484592 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.484501 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lz682_78d16b88-530b-4cf5-a6d2-d70097c71800/kube-rbac-proxy-main/0.log"
Apr 22 19:08:22.504847 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.504820 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lz682_78d16b88-530b-4cf5-a6d2-d70097c71800/kube-rbac-proxy-self/0.log"
Apr 22 19:08:22.526345 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.526315 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-lz682_78d16b88-530b-4cf5-a6d2-d70097c71800/openshift-state-metrics/0.log"
Apr 22 19:08:22.558121 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.558085 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/prometheus/0.log"
Apr 22 19:08:22.576102 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.576072 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/config-reloader/0.log"
Apr 22 19:08:22.596772 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.596740 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/thanos-sidecar/0.log"
Apr 22 19:08:22.615900 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.615876 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/kube-rbac-proxy-web/0.log"
Apr 22 19:08:22.636556 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.636530 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/kube-rbac-proxy/0.log"
Apr 22 19:08:22.656132 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.656093 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/kube-rbac-proxy-thanos/0.log"
Apr 22 19:08:22.674826 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.674789 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f873d8da-279d-4a65-bdef-a40c7f6c6f93/init-config-reloader/0.log"
Apr 22 19:08:22.771105 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.771001 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76cf6dcdcf-2tgsb_d922fb92-79c5-4a55-a2b5-321fe55b5381/telemeter-client/0.log"
Apr 22 19:08:22.789861 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.789835 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76cf6dcdcf-2tgsb_d922fb92-79c5-4a55-a2b5-321fe55b5381/reload/0.log"
Apr 22 19:08:22.813168 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.813142 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76cf6dcdcf-2tgsb_d922fb92-79c5-4a55-a2b5-321fe55b5381/kube-rbac-proxy/0.log"
Apr 22 19:08:22.842735 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.842670 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f989975c-2xc7s_814b00bd-8708-4647-b738-0041818be65e/thanos-query/0.log"
Apr 22 19:08:22.861533 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.861453 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f989975c-2xc7s_814b00bd-8708-4647-b738-0041818be65e/kube-rbac-proxy-web/0.log"
Apr 22 19:08:22.884955 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.884925 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f989975c-2xc7s_814b00bd-8708-4647-b738-0041818be65e/kube-rbac-proxy/0.log"
Apr 22 19:08:22.915003 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.914979 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f989975c-2xc7s_814b00bd-8708-4647-b738-0041818be65e/prom-label-proxy/0.log"
Apr 22 19:08:22.938207 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.938177 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f989975c-2xc7s_814b00bd-8708-4647-b738-0041818be65e/kube-rbac-proxy-rules/0.log"
Apr 22 19:08:22.963284 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:22.963250 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f989975c-2xc7s_814b00bd-8708-4647-b738-0041818be65e/kube-rbac-proxy-metrics/0.log"
Apr 22 19:08:25.139333 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139302 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"]
Apr 22 19:08:25.139769 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139649 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="gather"
Apr 22 19:08:25.139769 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139661 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="gather"
Apr 22 19:08:25.139769 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139671 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="copy"
Apr 22 19:08:25.139769 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139676 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="copy"
Apr 22 19:08:25.139769 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139750 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="gather"
Apr 22 19:08:25.139769 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.139760 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="5923c8cb-4ee7-49fd-8a46-c6ead6c4b7db" containerName="copy"
Apr 22 19:08:25.143722 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.143701 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.153725 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.153698 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"]
Apr 22 19:08:25.221463 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.221417 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rk2k\" (UniqueName: \"kubernetes.io/projected/3e140f02-e900-4d15-9618-e38482f3f59a-kube-api-access-2rk2k\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.221463 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.221460 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-lib-modules\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.221673 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.221485 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-podres\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.221673 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.221510 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-proc\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.221673 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.221587 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-sys\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.322740 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322686 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-sys\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.322932 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322779 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rk2k\" (UniqueName: \"kubernetes.io/projected/3e140f02-e900-4d15-9618-e38482f3f59a-kube-api-access-2rk2k\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.322932 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322816 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-lib-modules\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.322932 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-sys\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.322932 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322846 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-podres\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.322932 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322872 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-proc\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.323269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.322979 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-proc\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.323269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.323141 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-lib-modules\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.323269 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.323223 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3e140f02-e900-4d15-9618-e38482f3f59a-podres\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.331603 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.331573 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rk2k\" (UniqueName: \"kubernetes.io/projected/3e140f02-e900-4d15-9618-e38482f3f59a-kube-api-access-2rk2k\") pod \"perf-node-gather-daemonset-ckd8l\" (UID: \"3e140f02-e900-4d15-9618-e38482f3f59a\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.456656 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.456630 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:25.594886 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.594659 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"]
Apr 22 19:08:25.598461 ip-10-0-128-208 kubenswrapper[2562]: W0422 19:08:25.598434 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3e140f02_e900_4d15_9618_e38482f3f59a.slice/crio-82ccf9781c5101ccb0bf4f2e19f4f4723743093b99ebd25d1188d68d872fee8e WatchSource:0}: Error finding container 82ccf9781c5101ccb0bf4f2e19f4f4723743093b99ebd25d1188d68d872fee8e: Status 404 returned error can't find the container with id 82ccf9781c5101ccb0bf4f2e19f4f4723743093b99ebd25d1188d68d872fee8e
Apr 22 19:08:25.600713 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.600133 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:08:25.977481 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.977452 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j7m2r_d2fb4248-956b-4863-9bda-6b409ba13de6/dns/0.log"
Apr 22 19:08:25.995616 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:25.995540 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j7m2r_d2fb4248-956b-4863-9bda-6b409ba13de6/kube-rbac-proxy/0.log"
Apr 22 19:08:26.100900 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:26.100870 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sjgbr_c99d2e9b-5df0-44c6-9f90-b824537af676/dns-node-resolver/0.log"
Apr 22 19:08:26.166715 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:26.166674 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l" event={"ID":"3e140f02-e900-4d15-9618-e38482f3f59a","Type":"ContainerStarted","Data":"781310c46caf69e0342753a987224b54d7dc124677bab163fdc2196d6decf813"}
Apr 22 19:08:26.167257 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:26.166721 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l" event={"ID":"3e140f02-e900-4d15-9618-e38482f3f59a","Type":"ContainerStarted","Data":"82ccf9781c5101ccb0bf4f2e19f4f4723743093b99ebd25d1188d68d872fee8e"}
Apr 22 19:08:26.167257 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:26.166835 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:26.182764 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:26.182723 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l" podStartSLOduration=1.182708991 podStartE2EDuration="1.182708991s" podCreationTimestamp="2026-04-22 19:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:26.181498375 +0000 UTC m=+1317.860724666" watchObservedRunningTime="2026-04-22 19:08:26.182708991 +0000 UTC m=+1317.861935282"
Apr 22 19:08:26.523149 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:26.523120 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qssrm_01634ae5-68d0-4ab8-8b17-7736250c3f31/node-ca/0.log"
Apr 22 19:08:27.587787 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:27.587755 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zzrc9_8457890c-9b4b-4d2b-b315-2a7b5aaa060e/serve-healthcheck-canary/0.log"
Apr 22 19:08:28.053161 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:28.053136 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kcplt_71884152-334a-4b9f-8f06-c42c443f8518/kube-rbac-proxy/0.log"
Apr 22 19:08:28.072200 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:28.072181 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kcplt_71884152-334a-4b9f-8f06-c42c443f8518/exporter/0.log"
Apr 22 19:08:28.092220 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:28.092196 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kcplt_71884152-334a-4b9f-8f06-c42c443f8518/extractor/0.log"
Apr 22 19:08:30.124066 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:30.124028 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-swv5t_454f1685-874d-406c-8bfc-5d7de31631b7/manager/0.log"
Apr 22 19:08:30.140877 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:30.140854 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-lwt56_a510694b-77d7-467b-99d3-07b8d4fd4d86/s3-init/0.log"
Apr 22 19:08:30.166805 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:30.166781 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-cqwz5_a4b4aca4-a42d-4da5-bf19-0e0dcca4d385/seaweedfs/0.log"
Apr 22 19:08:32.181818 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:32.181792 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-ckd8l"
Apr 22 19:08:35.351204 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.351132 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:08:35.370216 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.370183 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/egress-router-binary-copy/0.log"
Apr 22 19:08:35.387712 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.387689 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/cni-plugins/0.log"
Apr 22 19:08:35.406335 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.406316 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/bond-cni-plugin/0.log"
Apr 22 19:08:35.423507 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.423487 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/routeoverride-cni/0.log"
Apr 22 19:08:35.441410 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.441392 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/whereabouts-cni-bincopy/0.log"
Apr 22 19:08:35.459062 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.459042 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rwjzd_7ff492a4-8166-428f-aa35-d7319e606032/whereabouts-cni/0.log"
Apr 22 19:08:35.522399 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.522369 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j7np9_32fb3eb3-1544-4a71-8e35-ca98066a2f14/kube-multus/0.log"
Apr 22 19:08:35.570556 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.570529 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4vbxg_da9bff9a-df34-4fcf-9338-631fbb086e31/network-metrics-daemon/0.log"
Apr 22 19:08:35.590113 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:35.590084 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4vbxg_da9bff9a-df34-4fcf-9338-631fbb086e31/kube-rbac-proxy/0.log"
Apr 22 19:08:37.018476 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.018447 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/ovn-controller/0.log"
Apr 22 19:08:37.047141 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.047112 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/ovn-acl-logging/0.log"
Apr 22 19:08:37.072147 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.072123 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/kube-rbac-proxy-node/0.log"
Apr 22 19:08:37.089894 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.089870 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:08:37.107897 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.107875 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/northd/0.log"
Apr 22 19:08:37.124930 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.124906 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/nbdb/0.log"
Apr 22 19:08:37.146396 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.146376 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/sbdb/0.log"
Apr 22 19:08:37.339035 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:37.338951 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zf6t2_df350c2f-f320-4324-915f-7ba97845f4cf/ovnkube-controller/0.log"
Apr 22 19:08:38.264338 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:38.264305 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9g8sf_0b614b94-c0eb-4f39-add0-c46922389f94/network-check-target-container/0.log"
Apr 22 19:08:39.101282 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:39.101255 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-49mmp_9bebf205-15a5-47cd-a9bd-2fe359cd3118/iptables-alerter/0.log"
Apr 22 19:08:39.717983 ip-10-0-128-208 kubenswrapper[2562]: I0422 19:08:39.717959 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9khsc_977c5b80-af35-48df-9e61-f66a82bc4f6b/tuned/0.log"