Apr 21 03:54:45.236210 ip-10-0-128-88 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 03:54:45.236223 ip-10-0-128-88 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 03:54:45.236232 ip-10-0-128-88 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 03:54:45.236535 ip-10-0-128-88 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 03:54:55.276292 ip-10-0-128-88 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 03:54:55.276308 ip-10-0-128-88 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 79b89e1e1c884400b20bb9ad230e014d --
Apr 21 03:57:23.014341 ip-10-0-128-88 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 03:57:23.456850 ip-10-0-128-88 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:23.456850 ip-10-0-128-88 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 03:57:23.456850 ip-10-0-128-88 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:23.456850 ip-10-0-128-88 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 03:57:23.456850 ip-10-0-128-88 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:23.457553 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.457457    2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 03:57:23.461194 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461179    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:23.461194 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461193    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461197    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461200    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461204    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461207    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461210    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461213    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461216    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461219    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461221    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461226    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461229    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461231    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461234    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461237    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461240    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461243    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461246    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461248    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461251    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:23.461256 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461254    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461257    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461262    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461265    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461268    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461271    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461274    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461276    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461279    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461281    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461284    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461287    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461289    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461292    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461300    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461303    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461306    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461308    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461311    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:23.461745 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461313    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461316    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461318    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461321    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461323    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461326    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461328    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461332    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461334    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461337    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461339    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461343    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461347    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461350    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461353    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461356    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461359    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461362    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461365    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461368    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:23.462231 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461371    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461373    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461377    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461381    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461384    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461387    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461390    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461392    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461396    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461398    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461401    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461403    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461406    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461408    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461411    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461414    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461417    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461419    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461422    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:23.462705 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461425    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461427    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461430    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461433    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461435    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461438    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461440    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461818    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461823    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461827    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461829    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461832    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461835    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461838    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461843    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461847    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461850    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461853    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461856    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:23.463192 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461858    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461861    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461864    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461867    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461869    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461872    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461874    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461876    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461879    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461883    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461885    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461888    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461890    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461893    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461895    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461898    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461901    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461904    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461906    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461908    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:23.463643 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461912    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461915    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461918    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461920    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461923    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461925    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461928    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461931    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461933    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461936    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461939    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461943    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461946    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461949    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461951    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461954    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461956    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461958    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461962    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:23.464144 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461964    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461967    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461969    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461972    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461975    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461977    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461980    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461982    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461985    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461988    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461990    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461993    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461996    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.461998    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462001    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462005    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462008    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462010    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462013    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462015    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:23.464610 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462018    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462020    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462023    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462025    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462028    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462031    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462034    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462036    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462038    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462041    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462043    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462046    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462049    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462052    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462054    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462140    2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462152    2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462161    2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462168    2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462175    2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462180    2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 03:57:23.465125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462185    2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462190    2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462193    2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462196    2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462200    2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462203    2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462206    2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462209    2580 flags.go:64] FLAG: --cgroup-root=""
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462211    2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462215    2580 flags.go:64] FLAG: --client-ca-file=""
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462217    2580 flags.go:64] FLAG: --cloud-config=""
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462220    2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462223    2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462227    2580 flags.go:64] FLAG: --cluster-domain=""
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462230    2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462234    2580 flags.go:64] FLAG: --config-dir=""
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462236    2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462240    2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462244    2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462247    2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462250    2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462253    2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462257    2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462260    2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 03:57:23.465641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462263    2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462267    2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462270    2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462276    2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462280    2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462283    2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462286    2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462289    2580 flags.go:64] FLAG: --enable-server="true"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462292    2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462298    2580 flags.go:64] FLAG: --event-burst="100"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462301    2580 flags.go:64] FLAG: --event-qps="50"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462304    2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462307    2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462310    2580 flags.go:64] FLAG: --eviction-hard=""
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462314    2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462317    2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462320    2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462323    2580 flags.go:64] FLAG: --eviction-soft=""
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462326    2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462328    2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462331    2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462334    2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462337    2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21
03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462340 2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462343 2580 flags.go:64] FLAG: --feature-gates="" Apr 21 03:57:23.466232 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462347 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462350 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462353 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462356 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462360 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462363 2580 flags.go:64] FLAG: --help="false" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462366 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462369 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462372 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462375 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462378 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462382 2580 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462385 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462388 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462391 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462393 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462396 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462399 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462402 2580 flags.go:64] FLAG: --kube-reserved="" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462405 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462407 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462411 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462413 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462416 2580 flags.go:64] FLAG: --lock-file="" Apr 21 03:57:23.466870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462419 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462422 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 
03:57:23.462425 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462430 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462433 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462436 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462439 2580 flags.go:64] FLAG: --logging-format="text" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462446 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462450 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462453 2580 flags.go:64] FLAG: --manifest-url="" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462456 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462460 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462463 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462467 2580 flags.go:64] FLAG: --max-pods="110" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462470 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462473 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462476 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 
03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462479 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462482 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462485 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462488 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462495 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462498 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462501 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 03:57:23.467454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462504 2580 flags.go:64] FLAG: --pod-cidr="" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462507 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462513 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462516 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462519 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462522 2580 flags.go:64] FLAG: --port="10250" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462525 2580 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462528 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-074ec2fbe02f02136" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462531 2580 flags.go:64] FLAG: --qos-reserved="" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462534 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462537 2580 flags.go:64] FLAG: --register-node="true" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462540 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462543 2580 flags.go:64] FLAG: --register-with-taints="" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462546 2580 flags.go:64] FLAG: --registry-burst="10" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462549 2580 flags.go:64] FLAG: --registry-qps="5" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462552 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462560 2580 flags.go:64] FLAG: --reserved-memory="" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462564 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462567 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462570 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462573 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462576 2580 flags.go:64] FLAG: 
--runonce="false" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462579 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462582 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462585 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 21 03:57:23.468061 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462587 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462590 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462594 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462597 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462600 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462609 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462612 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462614 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462617 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462620 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462623 2580 flags.go:64] FLAG: --system-cgroups="" Apr 21 03:57:23.468664 ip-10-0-128-88 
kubenswrapper[2580]: I0421 03:57:23.462626 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462632 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462634 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462637 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462641 2580 flags.go:64] FLAG: --tls-min-version="" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462644 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462646 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462649 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462652 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462655 2580 flags.go:64] FLAG: --v="2" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462659 2580 flags.go:64] FLAG: --version="false" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462663 2580 flags.go:64] FLAG: --vmodule="" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462669 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.462672 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 03:57:23.468664 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462765 2580 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462769 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462772 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462775 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462777 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462780 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462782 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462785 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462788 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462790 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462793 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462795 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462798 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462801 2580 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462803 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462806 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462824 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462828 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462833 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462838 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:23.469321 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462841 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462844 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462847 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462850 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462853 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462855 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 
03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462858 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462861 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462863 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462866 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462870 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462873 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462875 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462878 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462881 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462883 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462886 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462888 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462891 2580 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:23.469822 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462893 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462896 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462898 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462901 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462903 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462905 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462908 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462911 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462915 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462918 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462920 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462923 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462925 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462928 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462930 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462932 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462935 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462937 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462940 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:23.470322 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462942 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462945 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462947 2580 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462950 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462953 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462956 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462958 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462961 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462964 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462967 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462969 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462972 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462974 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462977 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462979 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462982 2580 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462985 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462987 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462989 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462992 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:23.470770 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462994 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.462997 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.463000 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.463002 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.463005 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.463008 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.463010 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.463013 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:23.471328 
ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.463758 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.470037 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.470052 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470130 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470136 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470141 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470144 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:23.471328 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470147 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470150 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470152 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470155 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470158 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470160 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470163 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470165 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470168 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470171 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470174 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470177 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470179 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470182 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470185 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470188 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470191 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470193 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470196 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470198 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:23.471746 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470201 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470204 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470206 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470209 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470212 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470214 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470217 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470221 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470224 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470227 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470230 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470234 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470238 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470241 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470244 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470246 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470248 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470251 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470254 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:23.472249 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470256 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470259 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470261 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470264 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470266 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470269 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470272 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470274 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470277 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470279 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470282 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470285 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470287 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470290 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470292 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470296 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470299 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470302 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470305 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470308 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:23.472704 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470311 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470314 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470316 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470319 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470322 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470324 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470327 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470330 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470332 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470335 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470338 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470340 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470343 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470345 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470347 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470350 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470352 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470355 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470357 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470360 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:23.473204 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470362 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470365 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470368 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.470373 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470474 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470479 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470482 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470485 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470488 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470491 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470493 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470496 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470498 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470501 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470504 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:23.473690 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470506 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470509 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470512 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470514 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470517 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470519 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470522 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470524 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470527 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470530 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470532 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470534 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470537 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470539 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470542 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470544 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470547 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470549 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470552 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470554 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:23.474048 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470556 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470559 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470562 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470564 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470567 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470571 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470575 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470577 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470580 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470583 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470585 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470588 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470591 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470593 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470596 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470599 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470601 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470603 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470606 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470608 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:23.474550 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470610 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470613 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470616 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470618 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470621 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470623 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470626 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470628 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470630 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470634 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470637 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470640 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470643 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470646 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470648 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470651 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470654 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470656 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470659 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:23.475036 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470661 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470664 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470666 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470669 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470672 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470674 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470677 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470680 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470682 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470684 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470687 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470689 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470692 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470694 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470697 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:23.470699 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:23.475515 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.470704 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:23.475918 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.471442 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 03:57:23.475918 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.473828 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 03:57:23.475918 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.474617 2580 server.go:1019] "Starting client certificate rotation"
Apr 21 03:57:23.475918 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.474713 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:57:23.475918 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.474753 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:57:23.499897 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.499875 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:57:23.502836 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.502815 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:57:23.519779 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.519756 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 21 03:57:23.526296 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.526281 2580 log.go:25] "Validated CRI v1 image API"
Apr 21 03:57:23.527529 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.527507 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 03:57:23.530563 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.530543 2580 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 87e7749d-202e-4b41-8917-279c95d7b519:/dev/nvme0n1p3 b2bf0822-9aa7-45e9-947f-9dd920d4eed1:/dev/nvme0n1p4]
Apr 21 03:57:23.530628 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.530562 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 03:57:23.532294 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.532276 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 03:57:23.536317 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.536212 2580 manager.go:217] Machine: {Timestamp:2026-04-21 03:57:23.534217755 +0000 UTC m=+0.404873697 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101386 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20ae97c6d3b780ab01a42cfe6a3648 SystemUUID:ec20ae97-c6d3-b780-ab01-a42cfe6a3648 BootID:79b89e1e-1c88-4400-b20b-b9ad230e014d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ad:52:85:d9:5b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ad:52:85:d9:5b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:84:54:24:be:09 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 03:57:23.536317 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.536312 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 03:57:23.536431 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.536419 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 03:57:23.538551 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.538528 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 03:57:23.538685 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.538554 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-88.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0
,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 03:57:23.538731 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.538694 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 03:57:23.538731 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.538703 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 03:57:23.538731 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.538716 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:23.539380 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.539368 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:23.540170 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.540160 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:23.540290 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.540281 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 03:57:23.542498 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.542488 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 21 03:57:23.542531 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.542507 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 03:57:23.542531 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.542519 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 03:57:23.542531 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.542529 2580 kubelet.go:397] "Adding apiserver pod source" Apr 21 
03:57:23.542612 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.542538 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 03:57:23.543942 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.543931 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:23.543979 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.543950 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:23.546609 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.546593 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 03:57:23.547954 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.547942 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 03:57:23.548531 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.548516 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ngj92" Apr 21 03:57:23.549790 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549777 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549795 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549802 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549807 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549813 2580 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549819 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549825 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549831 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549838 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 03:57:23.549852 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549844 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 03:57:23.550098 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549863 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 03:57:23.550098 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.549872 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 03:57:23.550877 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.550867 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 03:57:23.550877 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.550877 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 03:57:23.554467 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.554350 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 03:57:23.554543 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.554488 2580 server.go:1295] "Started kubelet" Apr 21 03:57:23.554593 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.554557 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 03:57:23.555915 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.555774 2580 ratelimit.go:55] "Setting 
rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 03:57:23.556011 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.555948 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 03:57:23.556136 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.556121 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-88.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 03:57:23.556247 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.556228 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-88.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 03:57:23.556307 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.556294 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 03:57:23.556489 ip-10-0-128-88 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 03:57:23.557703 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.557452 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 03:57:23.557908 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.557762 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ngj92" Apr 21 03:57:23.558719 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.558700 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 21 03:57:23.566141 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.566121 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 03:57:23.566220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.566129 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:23.566775 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.566759 2580 factory.go:55] Registering systemd factory Apr 21 03:57:23.566851 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.566784 2580 factory.go:223] Registration of the systemd container factory successfully Apr 21 03:57:23.567169 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.567136 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:23.567385 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567365 2580 factory.go:153] Registering CRI-O factory Apr 21 03:57:23.567385 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567386 2580 factory.go:223] Registration of the crio container factory successfully Apr 21 03:57:23.567499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567438 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory 
Apr 21 03:57:23.567499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567461 2580 factory.go:103] Registering Raw factory Apr 21 03:57:23.567499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567473 2580 manager.go:1196] Started watching for new ooms in manager Apr 21 03:57:23.567622 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567557 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 03:57:23.567622 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567561 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 03:57:23.567622 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567583 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 03:57:23.567740 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.567666 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 03:57:23.567740 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567719 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 21 03:57:23.567740 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567731 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 21 03:57:23.567892 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.567881 2580 manager.go:319] Starting recovery of all containers Apr 21 03:57:23.568477 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.568371 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:23.571328 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.571306 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-88.ec2.internal\" not found" node="ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.576985 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.576969 2580 manager.go:324] Recovery completed Apr 21 
03:57:23.580912 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.580899 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:23.583276 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.583240 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:23.583276 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.583270 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:23.583387 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.583280 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:23.583781 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.583769 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 03:57:23.583781 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.583780 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 03:57:23.583866 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.583795 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:23.586380 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.586369 2580 policy_none.go:49] "None policy: Start" Apr 21 03:57:23.586421 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.586385 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 03:57:23.586421 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.586395 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.631826 2580 manager.go:341] "Starting Device Plugin manager" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.631899 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" 
checkpoint="kubelet_internal_checkpoint" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.631910 2580 server.go:85] "Starting device plugin registration server" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.632128 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.632138 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.632220 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.632304 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.632314 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.632848 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 03:57:23.636684 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.632884 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:23.699896 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.699860 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 03:57:23.701034 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.701012 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 21 03:57:23.701185 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.701041 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 03:57:23.701185 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.701058 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 03:57:23.701185 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.701064 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 03:57:23.701185 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.701114 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 03:57:23.703632 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.703605 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:23.732741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.732688 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:23.733543 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.733519 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:23.733606 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.733556 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:23.733606 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.733568 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:23.733606 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.733593 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.742800 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.742783 2580 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.742841 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.742807 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-88.ec2.internal\": node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:23.766315 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.766296 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:23.801698 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.801668 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal"] Apr 21 03:57:23.801761 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.801738 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:23.802626 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.802611 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:23.802693 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.802639 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:23.802693 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.802649 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:23.803739 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.803727 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:23.803878 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.803865 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.803915 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.803892 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:23.804401 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.804384 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:23.804486 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.804416 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:23.804486 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.804384 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:23.804486 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.804452 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:23.804486 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.804466 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:23.804486 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.804427 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:23.805482 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.805469 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.805547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.805493 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:23.806229 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.806214 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:23.806285 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.806246 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:23.806285 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.806260 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:23.832651 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.832632 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-88.ec2.internal\" not found" node="ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.836988 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.836972 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-88.ec2.internal\" not found" node="ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.866654 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.866630 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:23.869912 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.869898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ba609bfa3d3d9b3d9a7e1cb784a13b9-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal\" (UID: \"9ba609bfa3d3d9b3d9a7e1cb784a13b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.870010 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.869920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ba609bfa3d3d9b3d9a7e1cb784a13b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal\" (UID: \"9ba609bfa3d3d9b3d9a7e1cb784a13b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.870010 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.869938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6fbea50e95cbda5ca95918851e7c31a8-config\") pod \"kube-apiserver-proxy-ip-10-0-128-88.ec2.internal\" (UID: \"6fbea50e95cbda5ca95918851e7c31a8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.967362 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:23.967323 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:23.970613 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.970599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ba609bfa3d3d9b3d9a7e1cb784a13b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal\" (UID: \"9ba609bfa3d3d9b3d9a7e1cb784a13b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.970664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.970622 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9ba609bfa3d3d9b3d9a7e1cb784a13b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal\" (UID: \"9ba609bfa3d3d9b3d9a7e1cb784a13b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.970664 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.970645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6fbea50e95cbda5ca95918851e7c31a8-config\") pod \"kube-apiserver-proxy-ip-10-0-128-88.ec2.internal\" (UID: \"6fbea50e95cbda5ca95918851e7c31a8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.970739 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.970691 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ba609bfa3d3d9b3d9a7e1cb784a13b9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal\" (UID: \"9ba609bfa3d3d9b3d9a7e1cb784a13b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.970739 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.970701 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ba609bfa3d3d9b3d9a7e1cb784a13b9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal\" (UID: \"9ba609bfa3d3d9b3d9a7e1cb784a13b9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:23.970805 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:23.970736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6fbea50e95cbda5ca95918851e7c31a8-config\") pod \"kube-apiserver-proxy-ip-10-0-128-88.ec2.internal\" (UID: \"6fbea50e95cbda5ca95918851e7c31a8\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" Apr 21 03:57:24.068126 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:24.068049 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:24.134417 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.134393 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:24.139994 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.139977 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" Apr 21 03:57:24.169038 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:24.169011 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:24.269494 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:24.269472 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:24.370041 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:24.370016 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:24.470572 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:24.470547 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-88.ec2.internal\" not found" Apr 21 03:57:24.474787 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.474770 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 03:57:24.474943 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.474927 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:24.474981 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.474948 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:24.524215 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.524189 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:24.560287 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.560253 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 03:52:23 +0000 UTC" deadline="2027-11-10 22:58:05.373184375 +0000 UTC" Apr 21 03:57:24.560287 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.560281 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13651h0m40.812906848s" Apr 21 03:57:24.561771 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.561753 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:24.567109 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.567073 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:24.567195 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.567119 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" Apr 21 03:57:24.575554 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.575505 2580 reflector.go:430] "Caches populated" 
logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 03:57:24.577230 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:24.577202 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fbea50e95cbda5ca95918851e7c31a8.slice/crio-61c41f85ad2943b34af9b5f02face396a3c03f2c69846b57a5fb51d79b02dae9 WatchSource:0}: Error finding container 61c41f85ad2943b34af9b5f02face396a3c03f2c69846b57a5fb51d79b02dae9: Status 404 returned error can't find the container with id 61c41f85ad2943b34af9b5f02face396a3c03f2c69846b57a5fb51d79b02dae9 Apr 21 03:57:24.577792 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:24.577774 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ba609bfa3d3d9b3d9a7e1cb784a13b9.slice/crio-9ae79f92b3c73ab079385ab63b80d46dc145937d00955b8e93bea49becf40f78 WatchSource:0}: Error finding container 9ae79f92b3c73ab079385ab63b80d46dc145937d00955b8e93bea49becf40f78: Status 404 returned error can't find the container with id 9ae79f92b3c73ab079385ab63b80d46dc145937d00955b8e93bea49becf40f78 Apr 21 03:57:24.578389 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.578369 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:24.579357 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.579344 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" Apr 21 03:57:24.582960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.582946 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 03:57:24.585351 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.585309 2580 warnings.go:110] 
"Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:24.595411 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.595394 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7pmq7" Apr 21 03:57:24.605617 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.605593 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7pmq7" Apr 21 03:57:24.704490 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.704429 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" event={"ID":"6fbea50e95cbda5ca95918851e7c31a8","Type":"ContainerStarted","Data":"61c41f85ad2943b34af9b5f02face396a3c03f2c69846b57a5fb51d79b02dae9"} Apr 21 03:57:24.705368 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:24.705347 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" event={"ID":"9ba609bfa3d3d9b3d9a7e1cb784a13b9","Type":"ContainerStarted","Data":"9ae79f92b3c73ab079385ab63b80d46dc145937d00955b8e93bea49becf40f78"} Apr 21 03:57:25.415658 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.415631 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:25.543582 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.543557 2580 apiserver.go:52] "Watching apiserver" Apr 21 03:57:25.550897 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.550868 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 03:57:25.551981 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.551952 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-hjl4c","openshift-multus/network-metrics-daemon-x476t","openshift-ovn-kubernetes/ovnkube-node-gl9hq","kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal","openshift-cluster-node-tuning-operator/tuned-bgswk","openshift-dns/node-resolver-vjbp2","openshift-image-registry/node-ca-lrfz2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal","openshift-multus/multus-29b2g","openshift-network-diagnostics/network-check-target-f65gr","openshift-network-operator/iptables-alerter-m8ps6","kube-system/konnectivity-agent-8dh6m","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"] Apr 21 03:57:25.555937 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.555919 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:25.556027 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.555999 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:25.556099 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.556054 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:25.556182 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.556162 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:25.557174 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.557156 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.558340 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.558321 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.558912 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.558896 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 03:57:25.559349 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559139 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 03:57:25.559349 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559158 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.559349 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559210 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lwmrt\"" Apr 21 03:57:25.559349 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559145 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 03:57:25.559581 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559435 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.559581 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559473 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 03:57:25.559677 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559634 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.560112 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559822 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.560112 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.559974 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vbnx2\"" Apr 21 03:57:25.560112 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.560024 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.560883 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.560869 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.561358 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.561249 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.561358 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.561259 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.561358 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.561292 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8qztf\"" Apr 21 03:57:25.562385 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.562371 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.562462 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.562436 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.562978 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.562832 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.562978 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.562857 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 03:57:25.562978 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.562872 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kkx9s\"" Apr 21 03:57:25.563756 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.563737 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.563832 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.563770 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.564003 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.563953 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 03:57:25.564385 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.564367 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 03:57:25.564498 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.564473 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6zlb8\"" Apr 21 03:57:25.564616 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.564565 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.565355 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.565336 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 03:57:25.565449 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.565385 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rvcl2\"" Apr 21 03:57:25.565658 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.565641 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 03:57:25.566446 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.566427 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-m8ps6" Apr 21 03:57:25.566547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.566536 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.567653 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.567638 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.568659 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.568640 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dr94c\"" Apr 21 03:57:25.568659 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.568655 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.568813 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.568716 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8d2pz\"" Apr 21 03:57:25.568870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.568845 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 03:57:25.568870 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.568864 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 03:57:25.569115 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.569021 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 03:57:25.570036 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.569447 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.570036 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.569610 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 03:57:25.570036 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.569651 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-765kf\"" Apr 21 03:57:25.570244 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.570159 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 03:57:25.570244 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.570222 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 03:57:25.571307 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.570958 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 03:57:25.580289 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580269 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-socket-dir-parent\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.580380 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580297 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysctl-d\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.580380 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580316 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-var-lib-kubelet\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.580380 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-tmp-dir\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.580380 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580363 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-cni-bin\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.580585 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.580585 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580476 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.580585 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580503 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-kubelet\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.580585 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580528 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysconfig\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.580585 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580550 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysctl-conf\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.580585 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-ovn\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580599 
2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovnkube-config\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovn-node-metrics-cert\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580644 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-kubernetes\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580688 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7de1195-0825-476e-b0d2-fdf06e76e365-tmp\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580724 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-os-release\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " 
pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-cni-netd\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovnkube-script-lib\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.580828 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580800 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfr4s\" (UniqueName: \"kubernetes.io/projected/3a9c3508-2c05-4c97-851c-899383bc9ca7-kube-api-access-hfr4s\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-device-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580889 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwv47\" 
(UniqueName: \"kubernetes.io/projected/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-kube-api-access-wwv47\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580928 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-cnibin\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.580956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fd542f8-4ff1-46b3-821d-17015eac9ffa-cni-binary-copy\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581032 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-modprobe-d\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aaab1022-57cd-4e71-8136-36d25cbe7fa1-host\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581105 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3311248f-8a11-446e-8162-2933f3299d2d-iptables-alerter-script\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6" Apr 21 03:57:25.581137 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-registration-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581158 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qh4\" (UniqueName: \"kubernetes.io/projected/c7de1195-0825-476e-b0d2-fdf06e76e365-kube-api-access-54qh4\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581181 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-hosts-file\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8f92\" (UniqueName: \"kubernetes.io/projected/798393e0-1967-4ff8-bdbd-5debf844db1d-kube-api-access-c8f92\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: 
\"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581231 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581265 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-sys\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581307 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-tuned\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581336 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-conf-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581354 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjj4s\" (UniqueName: 
\"kubernetes.io/projected/8fd542f8-4ff1-46b3-821d-17015eac9ffa-kube-api-access-gjj4s\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-systemd\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581386 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-systemd\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581404 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-var-lib-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-env-overrides\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581440 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aaab1022-57cd-4e71-8136-36d25cbe7fa1-serviceca\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581460 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hgl\" (UniqueName: \"kubernetes.io/projected/aaab1022-57cd-4e71-8136-36d25cbe7fa1-kube-api-access-f4hgl\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581475 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7skt\" (UniqueName: \"kubernetes.io/projected/3311248f-8a11-446e-8162-2933f3299d2d-kube-api-access-d7skt\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6" Apr 21 03:57:25.581487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581496 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/753c9f07-568c-4fcd-a6b8-25bada9bac1b-konnectivity-ca\") pod \"konnectivity-agent-8dh6m\" (UID: \"753c9f07-568c-4fcd-a6b8-25bada9bac1b\") " pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-etc-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581547 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-run\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmthr\" (UniqueName: \"kubernetes.io/projected/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-kube-api-access-hmthr\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-system-cni-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581679 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-cni-dir\") 
pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-k8s-cni-cncf-io\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581722 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-lib-modules\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581745 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581767 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581790 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-run-ovn-kubernetes\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581822 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/753c9f07-568c-4fcd-a6b8-25bada9bac1b-agent-certs\") pod \"konnectivity-agent-8dh6m\" (UID: \"753c9f07-568c-4fcd-a6b8-25bada9bac1b\") " pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-895wt\" (UniqueName: \"kubernetes.io/projected/8b022fb0-5dad-478d-8300-571165261cef-kube-api-access-895wt\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581901 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-cni-multus\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581933 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-run-netns\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.581961 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-cni-bin\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-os-release\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-daemon-config\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-systemd-units\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582161 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-node-log\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-host\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-netns\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582230 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-kubelet\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582258 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582281 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-slash\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-socket-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582325 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-hostroot\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582350 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-etc-kubernetes\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 
03:57:25.582373 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-cnibin\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582395 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:25.582948 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3311248f-8a11-446e-8162-2933f3299d2d-host-slash\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6" Apr 21 03:57:25.583584 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582473 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.583584 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582498 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.583584 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582523 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-sys-fs\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.583584 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-multus-certs\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.583584 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.582570 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-log-socket\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.606799 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.606771 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:24 +0000 UTC" deadline="2027-11-24 16:58:25.151657796 +0000 UTC" Apr 21 03:57:25.606885 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.606800 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13981h0m59.544861437s" Apr 21 03:57:25.683175 ip-10-0-128-88 kubenswrapper[2580]: I0421 
03:57:25.683070 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-cni-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.683175 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683126 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-k8s-cni-cncf-io\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.683175 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683150 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-lib-modules\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.683175 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683180 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-cni-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683197 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683230 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-run-ovn-kubernetes\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683270 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/753c9f07-568c-4fcd-a6b8-25bada9bac1b-agent-certs\") pod \"konnectivity-agent-8dh6m\" (UID: \"753c9f07-568c-4fcd-a6b8-25bada9bac1b\") " pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-895wt\" (UniqueName: \"kubernetes.io/projected/8b022fb0-5dad-478d-8300-571165261cef-kube-api-access-895wt\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683324 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-lib-modules\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-cni-multus\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683350 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-run-ovn-kubernetes\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-run-netns\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683392 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-k8s-cni-cncf-io\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-cni-bin\") pod \"ovnkube-node-gl9hq\" (UID: 
\"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.683475 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-os-release\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-cni-multus\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-daemon-config\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683520 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-systemd-units\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683545 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-node-log\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683559 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-run-netns\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683567 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-host\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683436 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-cni-bin\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683593 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-netns\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683697 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-node-log\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683689 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-netns\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683739 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-systemd-units\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-kubelet\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683749 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-host\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683767 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683794 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-os-release\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683794 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-kubelet\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.684072 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683799 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-system-cni-dir\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683861 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-slash\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683887 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-socket-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683919 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-hostroot\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683959 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-slash\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.683963 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-hostroot\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-daemon-config\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684385 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684395 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-socket-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684445 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-etc-kubernetes\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-cnibin\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684516 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-etc-kubernetes\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3311248f-8a11-446e-8162-2933f3299d2d-host-slash\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-etc-selinux\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684603 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3311248f-8a11-446e-8162-2933f3299d2d-host-slash\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6"
Apr 21 03:57:25.684879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-sys-fs\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-cnibin\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684662 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-multus-certs\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-log-socket\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684716 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-socket-dir-parent\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-etc-selinux\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684786 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-socket-dir-parent\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684833 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-run-multus-certs\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.684889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-log-socket\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685052 2580 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysctl-d\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-var-lib-kubelet\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-tmp-dir\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-cni-bin\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685189 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685250 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-kubelet\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.685741 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysconfig\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685302 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-sys-fs\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685414 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysctl-d\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685524 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-tmp-dir\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685546 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-host-var-lib-cni-bin\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-var-lib-kubelet\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685605 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysctl-conf\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-ovn\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685655 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-kubelet\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685684 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovnkube-config\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovn-node-metrics-cert\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685771 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysconfig\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-sysctl-conf\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685828 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-kubernetes\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7de1195-0825-476e-b0d2-fdf06e76e365-tmp\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685903 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-os-release\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-cni-netd\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.686528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685972 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovnkube-script-lib\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.685989 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfr4s\" (UniqueName: \"kubernetes.io/projected/3a9c3508-2c05-4c97-851c-899383bc9ca7-kube-api-access-hfr4s\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686032 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-device-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwv47\" (UniqueName: \"kubernetes.io/projected/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-kube-api-access-wwv47\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686120 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-ovn\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") "
pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686138 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-cnibin\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686172 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fd542f8-4ff1-46b3-821d-17015eac9ffa-cni-binary-copy\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686204 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-modprobe-d\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/798393e0-1967-4ff8-bdbd-5debf844db1d-os-release\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686245 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aaab1022-57cd-4e71-8136-36d25cbe7fa1-host\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686273 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3311248f-8a11-446e-8162-2933f3299d2d-iptables-alerter-script\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-modprobe-d\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-registration-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686359 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-registration-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-host-cni-netd\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686403 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovnkube-config\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.687314 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8b022fb0-5dad-478d-8300-571165261cef-device-dir\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686809 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovnkube-script-lib\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686862 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54qh4\" (UniqueName: \"kubernetes.io/projected/c7de1195-0825-476e-b0d2-fdf06e76e365-kube-api-access-54qh4\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-hosts-file\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8f92\" (UniqueName: \"kubernetes.io/projected/798393e0-1967-4ff8-bdbd-5debf844db1d-kube-api-access-c8f92\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.686991 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-sys\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-tuned\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687047 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-conf-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjj4s\" (UniqueName: \"kubernetes.io/projected/8fd542f8-4ff1-46b3-821d-17015eac9ffa-kube-api-access-gjj4s\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-systemd\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-systemd\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-var-lib-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-env-overrides\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687250 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aaab1022-57cd-4e71-8136-36d25cbe7fa1-serviceca\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687280 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hgl\" (UniqueName: \"kubernetes.io/projected/aaab1022-57cd-4e71-8136-36d25cbe7fa1-kube-api-access-f4hgl\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687308 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3311248f-8a11-446e-8162-2933f3299d2d-iptables-alerter-script\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6"
Apr 21 03:57:25.688097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fd542f8-4ff1-46b3-821d-17015eac9ffa-cni-binary-copy\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687312 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7skt\" (UniqueName: \"kubernetes.io/projected/3311248f-8a11-446e-8162-2933f3299d2d-kube-api-access-d7skt\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6"
Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aaab1022-57cd-4e71-8136-36d25cbe7fa1-host\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2"
Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687465 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-cnibin\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g"
Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-systemd\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk"
Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687651 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-systemd\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687705 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-var-lib-openvswitch\")
pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.687751 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-hosts-file\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688010 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-run-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688070 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-sys\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688133 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a9c3508-2c05-4c97-851c-899383bc9ca7-env-overrides\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688301 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/753c9f07-568c-4fcd-a6b8-25bada9bac1b-agent-certs\") pod \"konnectivity-agent-8dh6m\" (UID: 
\"753c9f07-568c-4fcd-a6b8-25bada9bac1b\") " pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-multus-conf-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688515 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/aaab1022-57cd-4e71-8136-36d25cbe7fa1-serviceca\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.688875 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-kubernetes\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.688914 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a9c3508-2c05-4c97-851c-899383bc9ca7-ovn-node-metrics-cert\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/753c9f07-568c-4fcd-a6b8-25bada9bac1b-konnectivity-ca\") pod \"konnectivity-agent-8dh6m\" (UID: \"753c9f07-568c-4fcd-a6b8-25bada9bac1b\") " 
pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-etc-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689342 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-run\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmthr\" (UniqueName: \"kubernetes.io/projected/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-kube-api-access-hmthr\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.689499 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689445 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-system-cni-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " 
pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.689829 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689558 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fd542f8-4ff1-46b3-821d-17015eac9ffa-system-cni-dir\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.689829 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a9c3508-2c05-4c97-851c-899383bc9ca7-etc-openvswitch\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.689829 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.689721 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:25.689829 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.689745 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/753c9f07-568c-4fcd-a6b8-25bada9bac1b-konnectivity-ca\") pod \"konnectivity-agent-8dh6m\" (UID: \"753c9f07-568c-4fcd-a6b8-25bada9bac1b\") " pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.689829 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.689808 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:26.189779777 +0000 UTC m=+3.060435709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:25.690097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.690061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c7de1195-0825-476e-b0d2-fdf06e76e365-run\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.690558 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.690535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c7de1195-0825-476e-b0d2-fdf06e76e365-etc-tuned\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.691138 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.690727 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/798393e0-1967-4ff8-bdbd-5debf844db1d-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.691138 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.690834 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:25.691138 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.690850 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 
03:57:25.691138 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.690868 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:25.691138 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:25.690937 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:26.190914017 +0000 UTC m=+3.061569954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:25.691507 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.691484 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c7de1195-0825-476e-b0d2-fdf06e76e365-tmp\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.693960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.693753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-895wt\" (UniqueName: \"kubernetes.io/projected/8b022fb0-5dad-478d-8300-571165261cef-kube-api-access-895wt\") pod \"aws-ebs-csi-driver-node-q5krp\" (UID: \"8b022fb0-5dad-478d-8300-571165261cef\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" 
Apr 21 03:57:25.693960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.693924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfr4s\" (UniqueName: \"kubernetes.io/projected/3a9c3508-2c05-4c97-851c-899383bc9ca7-kube-api-access-hfr4s\") pod \"ovnkube-node-gl9hq\" (UID: \"3a9c3508-2c05-4c97-851c-899383bc9ca7\") " pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.693960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.693926 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwv47\" (UniqueName: \"kubernetes.io/projected/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-kube-api-access-wwv47\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:25.695669 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.695641 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54qh4\" (UniqueName: \"kubernetes.io/projected/c7de1195-0825-476e-b0d2-fdf06e76e365-kube-api-access-54qh4\") pod \"tuned-bgswk\" (UID: \"c7de1195-0825-476e-b0d2-fdf06e76e365\") " pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.696041 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.696006 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjj4s\" (UniqueName: \"kubernetes.io/projected/8fd542f8-4ff1-46b3-821d-17015eac9ffa-kube-api-access-gjj4s\") pod \"multus-29b2g\" (UID: \"8fd542f8-4ff1-46b3-821d-17015eac9ffa\") " pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.696459 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.696419 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hgl\" (UniqueName: \"kubernetes.io/projected/aaab1022-57cd-4e71-8136-36d25cbe7fa1-kube-api-access-f4hgl\") pod \"node-ca-lrfz2\" (UID: \"aaab1022-57cd-4e71-8136-36d25cbe7fa1\") " 
pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.696545 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.696482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7skt\" (UniqueName: \"kubernetes.io/projected/3311248f-8a11-446e-8162-2933f3299d2d-kube-api-access-d7skt\") pod \"iptables-alerter-m8ps6\" (UID: \"3311248f-8a11-446e-8162-2933f3299d2d\") " pod="openshift-network-operator/iptables-alerter-m8ps6" Apr 21 03:57:25.697742 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.697715 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8f92\" (UniqueName: \"kubernetes.io/projected/798393e0-1967-4ff8-bdbd-5debf844db1d-kube-api-access-c8f92\") pod \"multus-additional-cni-plugins-hjl4c\" (UID: \"798393e0-1967-4ff8-bdbd-5debf844db1d\") " pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.699010 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.698991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmthr\" (UniqueName: \"kubernetes.io/projected/8db5c95c-dcdd-437d-bbd8-b52b4146dc61-kube-api-access-hmthr\") pod \"node-resolver-vjbp2\" (UID: \"8db5c95c-dcdd-437d-bbd8-b52b4146dc61\") " pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.777206 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.777178 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:25.869012 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.868969 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:25.879042 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.879020 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bgswk" Apr 21 03:57:25.886694 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.886674 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vjbp2" Apr 21 03:57:25.892361 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.892336 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lrfz2" Apr 21 03:57:25.898657 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.898637 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-29b2g" Apr 21 03:57:25.905229 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.905210 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" Apr 21 03:57:25.911775 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.911751 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m8ps6" Apr 21 03:57:25.918346 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.918327 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:25.922878 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:25.922862 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" Apr 21 03:57:26.071662 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:26.071637 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd542f8_4ff1_46b3_821d_17015eac9ffa.slice/crio-d9a5e3085f42a8d9e6661f7501f1c0e4a57d11b5ed8c26dadbca6bad6b8e90b5 WatchSource:0}: Error finding container d9a5e3085f42a8d9e6661f7501f1c0e4a57d11b5ed8c26dadbca6bad6b8e90b5: Status 404 returned error can't find the container with id d9a5e3085f42a8d9e6661f7501f1c0e4a57d11b5ed8c26dadbca6bad6b8e90b5 Apr 21 03:57:26.072774 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:26.072740 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db5c95c_dcdd_437d_bbd8_b52b4146dc61.slice/crio-b50ffe5cf7aa235d82580e584fd4d9ab5606ed6ab886bb429987112170570d0a WatchSource:0}: Error finding container b50ffe5cf7aa235d82580e584fd4d9ab5606ed6ab886bb429987112170570d0a: Status 404 returned error can't find the container with id b50ffe5cf7aa235d82580e584fd4d9ab5606ed6ab886bb429987112170570d0a Apr 21 03:57:26.076569 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:26.076544 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaab1022_57cd_4e71_8136_36d25cbe7fa1.slice/crio-02501fe2e6ce7764f5b31b5bce2aebbdf9f993e4a4d37889f06d1a361cd4991f WatchSource:0}: Error finding container 02501fe2e6ce7764f5b31b5bce2aebbdf9f993e4a4d37889f06d1a361cd4991f: Status 404 returned error can't find the container with id 02501fe2e6ce7764f5b31b5bce2aebbdf9f993e4a4d37889f06d1a361cd4991f Apr 21 03:57:26.077220 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:26.077195 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9c3508_2c05_4c97_851c_899383bc9ca7.slice/crio-e2ce611929ad9f9eaac72404324ab2934e7bd4f7e0a80a809877bad024e196b6 WatchSource:0}: Error finding container e2ce611929ad9f9eaac72404324ab2934e7bd4f7e0a80a809877bad024e196b6: Status 404 returned error can't find the container with id e2ce611929ad9f9eaac72404324ab2934e7bd4f7e0a80a809877bad024e196b6 Apr 21 03:57:26.078312 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:26.078294 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod798393e0_1967_4ff8_bdbd_5debf844db1d.slice/crio-d8a7d1c89232aed6fc6653390ae5e7ea14e88fa99668ad503b250326032e7400 WatchSource:0}: Error finding container d8a7d1c89232aed6fc6653390ae5e7ea14e88fa99668ad503b250326032e7400: Status 404 returned error can't find the container with id d8a7d1c89232aed6fc6653390ae5e7ea14e88fa99668ad503b250326032e7400 Apr 21 03:57:26.191627 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.191598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:26.191720 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.191675 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:26.191854 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.191736 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:26.191854 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.191755 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:26.191854 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.191757 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:26.191854 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.191764 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:26.191854 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.191800 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:27.191787325 +0000 UTC m=+4.062443254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:26.191854 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.191814 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:57:27.191806479 +0000 UTC m=+4.062462407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:26.608527 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.608490 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:24 +0000 UTC" deadline="2027-09-17 13:58:50.698901934 +0000 UTC" Apr 21 03:57:26.608527 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.608525 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12346h1m24.090380628s" Apr 21 03:57:26.705822 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.705322 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:26.705822 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:26.705460 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:26.711361 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.711293 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerStarted","Data":"d8a7d1c89232aed6fc6653390ae5e7ea14e88fa99668ad503b250326032e7400"} Apr 21 03:57:26.714218 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.714153 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vjbp2" event={"ID":"8db5c95c-dcdd-437d-bbd8-b52b4146dc61","Type":"ContainerStarted","Data":"b50ffe5cf7aa235d82580e584fd4d9ab5606ed6ab886bb429987112170570d0a"} Apr 21 03:57:26.729183 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.729155 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-29b2g" event={"ID":"8fd542f8-4ff1-46b3-821d-17015eac9ffa","Type":"ContainerStarted","Data":"d9a5e3085f42a8d9e6661f7501f1c0e4a57d11b5ed8c26dadbca6bad6b8e90b5"} Apr 21 03:57:26.732808 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.732780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" event={"ID":"6fbea50e95cbda5ca95918851e7c31a8","Type":"ContainerStarted","Data":"b3dd4c5b853ab5e194682b2335a0b2e5759c676ae3d772560bbc42518e5673ad"} Apr 21 03:57:26.744282 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.744254 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8dh6m" event={"ID":"753c9f07-568c-4fcd-a6b8-25bada9bac1b","Type":"ContainerStarted","Data":"64f294d0847e2db4f098a9a0fb7a62f964031cee7128003a41c300bfd5f89006"} Apr 21 03:57:26.746676 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.746647 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" 
event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"e2ce611929ad9f9eaac72404324ab2934e7bd4f7e0a80a809877bad024e196b6"} Apr 21 03:57:26.755642 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.755614 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lrfz2" event={"ID":"aaab1022-57cd-4e71-8136-36d25cbe7fa1","Type":"ContainerStarted","Data":"02501fe2e6ce7764f5b31b5bce2aebbdf9f993e4a4d37889f06d1a361cd4991f"} Apr 21 03:57:26.764821 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.764788 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" event={"ID":"8b022fb0-5dad-478d-8300-571165261cef","Type":"ContainerStarted","Data":"255f25b2d61d933b9347e4f2a7f65bf08bc2d8a8f572c055a0fc72250a5d1c81"} Apr 21 03:57:26.778357 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.778331 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m8ps6" event={"ID":"3311248f-8a11-446e-8162-2933f3299d2d","Type":"ContainerStarted","Data":"78c987c62daa5c0ed102a1e40e0cf52ddefe9c7e26a515365a593b2c8da07b18"} Apr 21 03:57:26.780383 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:26.780361 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bgswk" event={"ID":"c7de1195-0825-476e-b0d2-fdf06e76e365","Type":"ContainerStarted","Data":"bc6673e3992a1bbddc24c6a8755eea0be1f8a5fafb6b3d63c8918a5aaf1b8d8c"} Apr 21 03:57:27.203780 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:27.203741 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:27.203944 
ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:27.203835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:27.204024 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.204006 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:27.204122 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.204110 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:29.204090006 +0000 UTC m=+6.074745948 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:27.204565 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.204545 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:27.204644 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.204574 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:27.204644 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.204588 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:27.204644 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.204635 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:29.204619615 +0000 UTC m=+6.075275549 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:27.702324 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:27.701797 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:27.702324 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:27.701916 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:27.795089 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:27.795040 2580 generic.go:358] "Generic (PLEG): container finished" podID="9ba609bfa3d3d9b3d9a7e1cb784a13b9" containerID="4de38617859789ca41a16a56d59d46f70d8cfbc733e63aa382961d0c356522ab" exitCode=0 Apr 21 03:57:27.795549 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:27.795522 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" event={"ID":"9ba609bfa3d3d9b3d9a7e1cb784a13b9","Type":"ContainerDied","Data":"4de38617859789ca41a16a56d59d46f70d8cfbc733e63aa382961d0c356522ab"} Apr 21 03:57:27.813247 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:27.811989 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-88.ec2.internal" podStartSLOduration=3.811972583 podStartE2EDuration="3.811972583s" podCreationTimestamp="2026-04-21 03:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:26.750531573 +0000 UTC m=+3.621187524" watchObservedRunningTime="2026-04-21 03:57:27.811972583 +0000 UTC m=+4.682628537" Apr 21 03:57:28.701616 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:28.701583 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:28.701806 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:28.701723 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:28.805150 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:28.804625 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" event={"ID":"9ba609bfa3d3d9b3d9a7e1cb784a13b9","Type":"ContainerStarted","Data":"5581c1c988fa69f51af3bfd3e302a01c4f933d5654aec887a98b5aaed2d43719"} Apr 21 03:57:29.220838 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:29.220726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:29.220838 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:29.220812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:29.221119 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.220922 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:29.221119 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.220951 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:29.221119 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.220965 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn 
for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:29.221119 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.221026 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:33.22100644 +0000 UTC m=+10.091662383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:29.221425 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.220932 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:29.221495 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.221462 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:33.221446886 +0000 UTC m=+10.092102820 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:29.702441 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:29.702358 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:29.702607 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:29.702495 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:30.701464 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:30.701425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:30.701915 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:30.701575 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:31.701629 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:31.701598 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:31.702067 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:31.701716 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:32.702031 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:32.701998 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:32.702498 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:32.702155 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:33.258491 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:33.258558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.258625 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.258695 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.258708 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:41.258687228 +0000 UTC m=+18.129343169 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.258712 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.258727 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:33.258823 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.258774 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:41.258758344 +0000 UTC m=+18.129414287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:33.702561 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:33.702483 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:33.702979 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:33.702598 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:34.702280 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:34.702246 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:34.702463 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:34.702378 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:35.701575 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:35.701539 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:35.701993 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:35.701653 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:36.701692 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:36.701654 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:36.702159 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:36.701791 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:37.701771 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:37.701742 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:37.702231 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:37.701861 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:38.701681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:38.701650 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:38.701861 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:38.701771 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:39.701500 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:39.701466 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:39.701751 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:39.701591 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:40.701989 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:40.701955 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:40.702502 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:40.702119 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:41.313034 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:41.312990 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:41.313254 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:41.313058 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:41.313254 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.313183 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:41.313254 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.313196 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:41.313254 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.313219 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:41.313254 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.313228 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:41.313489 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.313261 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.313240045 +0000 UTC m=+34.183895980 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:41.313489 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.313280 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.313271263 +0000 UTC m=+34.183927196 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:41.702274 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:41.702189 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:41.702712 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:41.702337 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:42.701484 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:42.701445 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:42.701670 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:42.701597 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:43.703055 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.703030 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:43.703739 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:43.703152 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:43.829668 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.829636 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" event={"ID":"8b022fb0-5dad-478d-8300-571165261cef","Type":"ContainerStarted","Data":"bf2d32d2e5f5aedaebe0dc974f5c047909979cb200bd4122e0805276a0984189"} Apr 21 03:57:43.831052 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.831023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bgswk" event={"ID":"c7de1195-0825-476e-b0d2-fdf06e76e365","Type":"ContainerStarted","Data":"133754f804b1099a4ef18ca368f67f4c40665e7ee3b2261a499fba7882396258"} Apr 21 03:57:43.832937 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.832912 2580 generic.go:358] "Generic (PLEG): container finished" podID="798393e0-1967-4ff8-bdbd-5debf844db1d" containerID="3ebd2538a3a3d03ed17c7a5e340fb7854e8c0ea86d4a710e762103a83c022a0a" exitCode=0 Apr 21 03:57:43.833028 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.832953 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerDied","Data":"3ebd2538a3a3d03ed17c7a5e340fb7854e8c0ea86d4a710e762103a83c022a0a"} Apr 21 03:57:43.834929 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.834905 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vjbp2" event={"ID":"8db5c95c-dcdd-437d-bbd8-b52b4146dc61","Type":"ContainerStarted","Data":"6198915fbc7ede399973aebe227de53e98ded6efb3cc18a97a273bcf3beb7982"} Apr 21 03:57:43.836775 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.836742 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-29b2g" 
event={"ID":"8fd542f8-4ff1-46b3-821d-17015eac9ffa","Type":"ContainerStarted","Data":"5ddc507dde5be4656576b229c2dd95f6416d1e6e9e135d742f550977d7a12a60"} Apr 21 03:57:43.838350 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.838327 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8dh6m" event={"ID":"753c9f07-568c-4fcd-a6b8-25bada9bac1b","Type":"ContainerStarted","Data":"40d766a90b72a94de35e57da78ef0a3981c82cc664dc5bd263edbff701a55159"} Apr 21 03:57:43.841017 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.840994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"b0dc250ecdfc4ab58177c28c06a545dd839b5b40ca9cd9914f60705f5687d45b"} Apr 21 03:57:43.841109 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.841021 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"75bbdabc5e9ffbe05fec985573fb2de11db76f10696dd351b924c2fc9a15fe0e"} Apr 21 03:57:43.841109 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.841036 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"adc887a2ecaeb9071b275a7c5dd54fdc4e9859ef28126ce7958b2a6e34ff84ed"} Apr 21 03:57:43.841109 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.841049 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"8446f31ea21d0f49b3541838930298582ce4c1231c70a2f62fbec0616000876c"} Apr 21 03:57:43.841109 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.841062 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"0242a91f78924f473886ef4fad8e3af92990df82835ca3dc72640d283e960a57"} Apr 21 03:57:43.842756 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.842736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lrfz2" event={"ID":"aaab1022-57cd-4e71-8136-36d25cbe7fa1","Type":"ContainerStarted","Data":"741f8077b404166853e162e69cd76da295f0aa6b8ab9de782bf2e16ada69d20f"} Apr 21 03:57:43.855274 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.855239 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bgswk" podStartSLOduration=3.870092347 podStartE2EDuration="20.855228773s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.081524218 +0000 UTC m=+2.952180152" lastFinishedPulling="2026-04-21 03:57:43.066660634 +0000 UTC m=+19.937316578" observedRunningTime="2026-04-21 03:57:43.854895254 +0000 UTC m=+20.725551204" watchObservedRunningTime="2026-04-21 03:57:43.855228773 +0000 UTC m=+20.725884772" Apr 21 03:57:43.855425 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.855404 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-88.ec2.internal" podStartSLOduration=19.855398333 podStartE2EDuration="19.855398333s" podCreationTimestamp="2026-04-21 03:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:57:28.82765253 +0000 UTC m=+5.698308481" watchObservedRunningTime="2026-04-21 03:57:43.855398333 +0000 UTC m=+20.726054282" Apr 21 03:57:43.868577 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.868538 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-vjbp2" podStartSLOduration=3.87867113 podStartE2EDuration="20.868523902s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.074875175 +0000 UTC m=+2.945531115" lastFinishedPulling="2026-04-21 03:57:43.06472794 +0000 UTC m=+19.935383887" observedRunningTime="2026-04-21 03:57:43.868168502 +0000 UTC m=+20.738824452" watchObservedRunningTime="2026-04-21 03:57:43.868523902 +0000 UTC m=+20.739179852" Apr 21 03:57:43.886920 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.886863 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-29b2g" podStartSLOduration=3.891474957 podStartE2EDuration="20.886845701s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.073572139 +0000 UTC m=+2.944228080" lastFinishedPulling="2026-04-21 03:57:43.06894288 +0000 UTC m=+19.939598824" observedRunningTime="2026-04-21 03:57:43.886179605 +0000 UTC m=+20.756835556" watchObservedRunningTime="2026-04-21 03:57:43.886845701 +0000 UTC m=+20.757501652" Apr 21 03:57:43.920120 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.920043 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8dh6m" podStartSLOduration=3.967968093 podStartE2EDuration="20.920024105s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.08489648 +0000 UTC m=+2.955552411" lastFinishedPulling="2026-04-21 03:57:43.036952481 +0000 UTC m=+19.907608423" observedRunningTime="2026-04-21 03:57:43.919698806 +0000 UTC m=+20.790354757" watchObservedRunningTime="2026-04-21 03:57:43.920024105 +0000 UTC m=+20.790680056" Apr 21 03:57:43.933470 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:43.933418 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lrfz2" podStartSLOduration=12.093805061 
podStartE2EDuration="20.933401779s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.078502858 +0000 UTC m=+2.949158786" lastFinishedPulling="2026-04-21 03:57:34.918099567 +0000 UTC m=+11.788755504" observedRunningTime="2026-04-21 03:57:43.933261097 +0000 UTC m=+20.803917048" watchObservedRunningTime="2026-04-21 03:57:43.933401779 +0000 UTC m=+20.804057732" Apr 21 03:57:44.310862 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.310681 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 03:57:44.645993 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.645821 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T03:57:44.310858509Z","UUID":"877771a5-8a77-4c29-b026-36e37ea5eeda","Handler":null,"Name":"","Endpoint":""} Apr 21 03:57:44.649269 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.648988 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 03:57:44.649269 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.649025 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 03:57:44.701747 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.701712 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:44.701947 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:44.701859 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:44.848413 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.848312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"7e7634e8120895705299216951d7fa70c143e0e39e9bffdc9bc53c1ffbab481c"} Apr 21 03:57:44.850194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.850166 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" event={"ID":"8b022fb0-5dad-478d-8300-571165261cef","Type":"ContainerStarted","Data":"cd21c882e40eda176b0cda4b3591ae9353b3905b4a020ec77f3646fe199ada61"} Apr 21 03:57:44.851816 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.851709 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m8ps6" event={"ID":"3311248f-8a11-446e-8162-2933f3299d2d","Type":"ContainerStarted","Data":"7801ca450c4bff50c0d23ea610227746dfeb70cdc47be5730f6fd49272a1e5d0"} Apr 21 03:57:44.866391 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:44.866339 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m8ps6" podStartSLOduration=4.886393812 podStartE2EDuration="21.866324441s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.084797846 +0000 UTC m=+2.955453777" 
lastFinishedPulling="2026-04-21 03:57:43.064728466 +0000 UTC m=+19.935384406" observedRunningTime="2026-04-21 03:57:44.865264201 +0000 UTC m=+21.735920152" watchObservedRunningTime="2026-04-21 03:57:44.866324441 +0000 UTC m=+21.736980392" Apr 21 03:57:45.702059 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:45.702023 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:45.702281 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:45.702151 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:45.855288 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:45.855251 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" event={"ID":"8b022fb0-5dad-478d-8300-571165261cef","Type":"ContainerStarted","Data":"59ba46cd204d1ed8cb160354ffcfc5155749d50773b72500ecabeca71c2e33f2"} Apr 21 03:57:45.871130 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:45.871067 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q5krp" podStartSLOduration=3.864722594 podStartE2EDuration="22.871054397s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.084815737 +0000 UTC m=+2.955471682" lastFinishedPulling="2026-04-21 03:57:45.091147557 +0000 UTC m=+21.961803485" observedRunningTime="2026-04-21 03:57:45.870667138 +0000 UTC m=+22.741323079" watchObservedRunningTime="2026-04-21 03:57:45.871054397 +0000 UTC m=+22.741710346" Apr 21 03:57:46.701792 ip-10-0-128-88 
kubenswrapper[2580]: I0421 03:57:46.701758 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:46.701954 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:46.701907 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:46.860098 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:46.860047 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"c993979e5dea61e548d48eed879f7cd41b8e725e98ea093f56a537070b9fa3df"} Apr 21 03:57:47.702054 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:47.702017 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:47.702223 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:47.702139 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:47.983633 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:47.983598 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:47.984198 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:47.984179 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:48.702156 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.702130 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:48.702336 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:48.702227 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:48.788266 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.788241 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:48.788787 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.788759 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8dh6m" Apr 21 03:57:48.864489 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.864462 2580 generic.go:358] "Generic (PLEG): container finished" podID="798393e0-1967-4ff8-bdbd-5debf844db1d" containerID="d80fc5b89dceef0a8e056af258a8d5918e664c8b952842fe2630de73b773c991" exitCode=0 Apr 21 03:57:48.864605 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.864549 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerDied","Data":"d80fc5b89dceef0a8e056af258a8d5918e664c8b952842fe2630de73b773c991"} Apr 21 03:57:48.867674 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.867647 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" event={"ID":"3a9c3508-2c05-4c97-851c-899383bc9ca7","Type":"ContainerStarted","Data":"faf3cb60fbb421e3b81a9aadad31d210f43e8a6922255b74fad1a1d47676f1a6"} Apr 21 03:57:48.868129 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.868037 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:48.868129 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.868062 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:48.868129 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.868076 2580 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:48.882669 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.882642 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:48.882748 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.882738 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" Apr 21 03:57:48.912287 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:48.912251 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq" podStartSLOduration=8.462419389 podStartE2EDuration="25.912240674s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.081848037 +0000 UTC m=+2.952503970" lastFinishedPulling="2026-04-21 03:57:43.531669312 +0000 UTC m=+20.402325255" observedRunningTime="2026-04-21 03:57:48.911382271 +0000 UTC m=+25.782038220" watchObservedRunningTime="2026-04-21 03:57:48.912240674 +0000 UTC m=+25.782896625" Apr 21 03:57:49.701986 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:49.701960 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:49.702454 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:49.702108 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:49.871695 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:49.871618 2580 generic.go:358] "Generic (PLEG): container finished" podID="798393e0-1967-4ff8-bdbd-5debf844db1d" containerID="41c5586559682ecc982541312284ea7732f1248e7c5f4edfa6e8474b0a5a3c6c" exitCode=0 Apr 21 03:57:49.871800 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:49.871700 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerDied","Data":"41c5586559682ecc982541312284ea7732f1248e7c5f4edfa6e8474b0a5a3c6c"} Apr 21 03:57:50.005290 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:50.005264 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x476t"] Apr 21 03:57:50.005403 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:50.005369 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:50.005480 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:50.005463 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:50.008001 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:50.007979 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f65gr"] Apr 21 03:57:50.008118 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:50.008105 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:50.008221 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:50.008200 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:50.875583 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:50.875546 2580 generic.go:358] "Generic (PLEG): container finished" podID="798393e0-1967-4ff8-bdbd-5debf844db1d" containerID="bcc3552670212c613a6579dd28c3c71798511f1b8b84111a8c567e41abe9a4da" exitCode=0 Apr 21 03:57:50.875944 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:50.875632 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerDied","Data":"bcc3552670212c613a6579dd28c3c71798511f1b8b84111a8c567e41abe9a4da"} Apr 21 03:57:51.702251 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:51.702220 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:51.702410 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:51.702220 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:51.702410 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:51.702320 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:51.702410 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:51.702384 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:53.702895 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:53.702859 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:53.703565 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:53.702953 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:53.703565 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:53.702983 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:53.703565 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:53.703067 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:55.702020 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:55.701981 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:57:55.702477 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:55.702028 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t" Apr 21 03:57:55.702477 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:55.702125 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f65gr" podUID="89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1" Apr 21 03:57:55.702477 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:55.702237 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x476t" podUID="8746933a-dcd1-407c-8ebf-6ce3af9d58c0" Apr 21 03:57:55.963564 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:55.963533 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-88.ec2.internal" event="NodeReady" Apr 21 03:57:55.963747 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:55.963666 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 03:57:56.012159 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.012129 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-29n5x"] Apr 21 03:57:56.032843 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.032815 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tj58v"] Apr 21 03:57:56.033018 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.032986 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.036037 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.035702 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 03:57:56.036037 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.035790 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lrpqk\"" Apr 21 03:57:56.036037 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.035982 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 03:57:56.048362 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.048341 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tj58v"] Apr 21 03:57:56.048521 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.048367 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-29n5x"] Apr 21 
03:57:56.048521 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.048465 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:57:56.051102 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.050649 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 03:57:56.051102 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.050659 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 03:57:56.051102 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.050687 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 03:57:56.051102 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.050929 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4vtdx\"" Apr 21 03:57:56.121562 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.121532 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/772f98a5-784d-4ae7-9617-a1b4f77424fb-config-volume\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.121713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.121563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbbx\" (UniqueName: \"kubernetes.io/projected/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-kube-api-access-cqbbx\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:57:56.121713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.121610 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/772f98a5-784d-4ae7-9617-a1b4f77424fb-tmp-dir\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.121713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.121626 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7j6z\" (UniqueName: \"kubernetes.io/projected/772f98a5-784d-4ae7-9617-a1b4f77424fb-kube-api-access-f7j6z\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.121713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.121677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:57:56.121713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.121700 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.222903 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.222832 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:57:56.222903 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.222868 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.222913 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/772f98a5-784d-4ae7-9617-a1b4f77424fb-config-volume\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.222940 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbbx\" (UniqueName: \"kubernetes.io/projected/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-kube-api-access-cqbbx\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.222985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/772f98a5-784d-4ae7-9617-a1b4f77424fb-tmp-dir\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.223009 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7j6z\" (UniqueName: \"kubernetes.io/projected/772f98a5-784d-4ae7-9617-a1b4f77424fb-kube-api-access-f7j6z\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.223021 2580 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.223115 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert podName:c3592d56-3229-4a9f-8d19-2b45ed61d4c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:56.723092224 +0000 UTC m=+33.593748170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert") pod "ingress-canary-tj58v" (UID: "c3592d56-3229-4a9f-8d19-2b45ed61d4c0") : secret "canary-serving-cert" not found Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.223022 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:56.223293 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.223206 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls podName:772f98a5-784d-4ae7-9617-a1b4f77424fb nodeName:}" failed. No retries permitted until 2026-04-21 03:57:56.723187845 +0000 UTC m=+33.593843787 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls") pod "dns-default-29n5x" (UID: "772f98a5-784d-4ae7-9617-a1b4f77424fb") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:56.223666 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.223495 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/772f98a5-784d-4ae7-9617-a1b4f77424fb-tmp-dir\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:57:56.223666 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.223539 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/772f98a5-784d-4ae7-9617-a1b4f77424fb-config-volume\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:57:56.233029 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.232875 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7j6z\" (UniqueName: \"kubernetes.io/projected/772f98a5-784d-4ae7-9617-a1b4f77424fb-kube-api-access-f7j6z\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:57:56.233178 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.233159 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbbx\" (UniqueName: \"kubernetes.io/projected/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-kube-api-access-cqbbx\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v"
Apr 21 03:57:56.347246 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.347209 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"]
Apr 21 03:57:56.370491 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.370450 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"]
Apr 21 03:57:56.370670 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.370500 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:56.372954 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.372933 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 21 03:57:56.373179 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.373159 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 21 03:57:56.373396 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.373379 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-dzfww\""
Apr 21 03:57:56.374483 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.374465 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 21 03:57:56.411457 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.411426 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-57d784794-br7xh"]
Apr 21 03:57:56.426339 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.426319 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.428674 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.428655 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 03:57:56.428786 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.428759 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 03:57:56.428999 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.428981 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 03:57:56.428999 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.428984 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bz9p5\""
Apr 21 03:57:56.429155 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.429016 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 03:57:56.429155 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.428988 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 03:57:56.429155 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.429047 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 03:57:56.439097 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.439027 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57d784794-br7xh"]
Apr 21 03:57:56.503390 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.503305 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"]
Apr 21 03:57:56.516825 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.516803 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"]
Apr 21 03:57:56.516952 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.516915 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.519191 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.519168 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hhcc9\""
Apr 21 03:57:56.519307 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.519234 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 03:57:56.519367 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.519237 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 03:57:56.519545 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.519528 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 03:57:56.519545 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.519539 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 03:57:56.525033 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525013 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-default-certificate\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.525152 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525050 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4m7\" (UniqueName: \"kubernetes.io/projected/f4c8739a-c60c-42cd-bc9f-8648b4999008-kube-api-access-zp4m7\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.525152 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525120 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.525152 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525148 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:56.525301 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525174 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vnh6\" (UniqueName: \"kubernetes.io/projected/5db70efe-7a8f-4630-ba85-c061199340f6-kube-api-access-7vnh6\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:56.525301 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525220 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.525301 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.525262 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-stats-auth\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.625868 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.625827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4m7\" (UniqueName: \"kubernetes.io/projected/f4c8739a-c60c-42cd-bc9f-8648b4999008-kube-api-access-zp4m7\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.626004 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.625889 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.626004 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.625914 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da84618b-ba61-4e8a-a07c-4ccc02317ed5-config\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.626004 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.625938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:56.626004 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.625956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zgt\" (UniqueName: \"kubernetes.io/projected/da84618b-ba61-4e8a-a07c-4ccc02317ed5-kube-api-access-v7zgt\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.626004 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.625976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vnh6\" (UniqueName: \"kubernetes.io/projected/5db70efe-7a8f-4630-ba85-c061199340f6-kube-api-access-7vnh6\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.626005 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.626101 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.626101 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.626148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-stats-auth\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.626150 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls podName:5db70efe-7a8f-4630-ba85-c061199340f6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.126135392 +0000 UTC m=+33.996791325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gxtnl" (UID: "5db70efe-7a8f-4630-ba85-c061199340f6") : secret "samples-operator-tls" not found
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.626237 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.126218259 +0000 UTC m=+33.996874202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : configmap references non-existent config key: service-ca.crt
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.626259 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.126246774 +0000 UTC m=+33.996902746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : secret "router-metrics-certs-default" not found
Apr 21 03:57:56.626330 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.626319 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-default-certificate\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.626641 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.626359 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da84618b-ba61-4e8a-a07c-4ccc02317ed5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.628678 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.628661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-default-certificate\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.628729 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.628717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-stats-auth\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.640525 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.640499 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vnh6\" (UniqueName: \"kubernetes.io/projected/5db70efe-7a8f-4630-ba85-c061199340f6-kube-api-access-7vnh6\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:56.640906 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.640891 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4m7\" (UniqueName: \"kubernetes.io/projected/f4c8739a-c60c-42cd-bc9f-8648b4999008-kube-api-access-zp4m7\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:56.727023 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.726989 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v"
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.727036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da84618b-ba61-4e8a-a07c-4ccc02317ed5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.727156 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.727215 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert podName:c3592d56-3229-4a9f-8d19-2b45ed61d4c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.727199839 +0000 UTC m=+34.597855767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert") pod "ingress-canary-tj58v" (UID: "c3592d56-3229-4a9f-8d19-2b45ed61d4c0") : secret "canary-serving-cert" not found
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.727157 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.727221 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.727331 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da84618b-ba61-4e8a-a07c-4ccc02317ed5-config\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:56.727361 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls podName:772f98a5-784d-4ae7-9617-a1b4f77424fb nodeName:}" failed. No retries permitted until 2026-04-21 03:57:57.727343775 +0000 UTC m=+34.597999708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls") pod "dns-default-29n5x" (UID: "772f98a5-784d-4ae7-9617-a1b4f77424fb") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:56.727431 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.727417 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zgt\" (UniqueName: \"kubernetes.io/projected/da84618b-ba61-4e8a-a07c-4ccc02317ed5-kube-api-access-v7zgt\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.727860 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.727825 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da84618b-ba61-4e8a-a07c-4ccc02317ed5-config\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.729156 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.729140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da84618b-ba61-4e8a-a07c-4ccc02317ed5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.734656 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.734633 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zgt\" (UniqueName: \"kubernetes.io/projected/da84618b-ba61-4e8a-a07c-4ccc02317ed5-kube-api-access-v7zgt\") pod \"service-ca-operator-d6fc45fc5-7f2dn\" (UID: \"da84618b-ba61-4e8a-a07c-4ccc02317ed5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.825958 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.825937 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"
Apr 21 03:57:56.982200 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:56.982173 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn"]
Apr 21 03:57:56.986097 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:57:56.986053 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda84618b_ba61_4e8a_a07c_4ccc02317ed5.slice/crio-070ed551448d159afea8f57b8d737359511bb62d79d191015ca1c4afcc4cc968 WatchSource:0}: Error finding container 070ed551448d159afea8f57b8d737359511bb62d79d191015ca1c4afcc4cc968: Status 404 returned error can't find the container with id 070ed551448d159afea8f57b8d737359511bb62d79d191015ca1c4afcc4cc968
Apr 21 03:57:57.129709 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.129680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:57.129831 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.129718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:57.129831 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.129740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:57.129925 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.129847 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 03:57:57.129925 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.129850 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 03:57:57.129925 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.129847 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.129828631 +0000 UTC m=+35.000484582 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : configmap references non-existent config key: service-ca.crt
Apr 21 03:57:57.129925 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.129921 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.129907904 +0000 UTC m=+35.000563835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : secret "router-metrics-certs-default" not found
Apr 21 03:57:57.130122 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.129932 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls podName:5db70efe-7a8f-4630-ba85-c061199340f6 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:58.12992571 +0000 UTC m=+35.000581638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gxtnl" (UID: "5db70efe-7a8f-4630-ba85-c061199340f6") : secret "samples-operator-tls" not found
Apr 21 03:57:57.332102 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.332053 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t"
Apr 21 03:57:57.332273 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.332141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:57:57.332273 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.332189 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:57.332273 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.332252 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 03:57:57.332273 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.332266 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 03:57:57.332273 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.332275 2580 projected.go:194] Error preparing data for projected volume kube-api-access-qdgrn for pod openshift-network-diagnostics/network-check-target-f65gr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:57.332425 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.332253 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs podName:8746933a-dcd1-407c-8ebf-6ce3af9d58c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:29.332238726 +0000 UTC m=+66.202894658 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs") pod "network-metrics-daemon-x476t" (UID: "8746933a-dcd1-407c-8ebf-6ce3af9d58c0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 03:57:57.332425 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.332325 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn podName:89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:29.33231163 +0000 UTC m=+66.202967558 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qdgrn" (UniqueName: "kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn") pod "network-check-target-f65gr" (UID: "89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 03:57:57.701482 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.701444 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:57:57.701713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.701678 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t"
Apr 21 03:57:57.703542 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.703517 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tn8dn\""
Apr 21 03:57:57.703657 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.703569 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 03:57:57.703657 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.703606 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 03:57:57.703846 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.703827 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rcf6j\""
Apr 21 03:57:57.703962 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.703880 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 03:57:57.735044 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.735017 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v"
Apr 21 03:57:57.735389 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.735061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:57:57.735389 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.735185 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 03:57:57.735389 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.735250 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert podName:c3592d56-3229-4a9f-8d19-2b45ed61d4c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.735231056 +0000 UTC m=+36.605886993 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert") pod "ingress-canary-tj58v" (UID: "c3592d56-3229-4a9f-8d19-2b45ed61d4c0") : secret "canary-serving-cert" not found
Apr 21 03:57:57.735389 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.735264 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 03:57:57.735389 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:57.735327 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls podName:772f98a5-784d-4ae7-9617-a1b4f77424fb nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.735311604 +0000 UTC m=+36.605967547 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls") pod "dns-default-29n5x" (UID: "772f98a5-784d-4ae7-9617-a1b4f77424fb") : secret "dns-default-metrics-tls" not found
Apr 21 03:57:57.890187 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.890143 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn" event={"ID":"da84618b-ba61-4e8a-a07c-4ccc02317ed5","Type":"ContainerStarted","Data":"070ed551448d159afea8f57b8d737359511bb62d79d191015ca1c4afcc4cc968"}
Apr 21 03:57:57.892990 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.892964 2580 generic.go:358] "Generic (PLEG): container finished" podID="798393e0-1967-4ff8-bdbd-5debf844db1d" containerID="40abf03daf396b753128ae9f08e99752b17b288b5c51cc99ed1eedf663acbe20" exitCode=0
Apr 21 03:57:57.893132 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:57.893005 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerDied","Data":"40abf03daf396b753128ae9f08e99752b17b288b5c51cc99ed1eedf663acbe20"}
Apr 21 03:57:58.140122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:58.139891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:58.140246 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:58.140013 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 03:57:58.140246 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:58.140229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:57:58.140343 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:58.140250 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.140230114 +0000 UTC m=+37.010886048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : secret "router-metrics-certs-default" not found
Apr 21 03:57:58.140343 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:58.140285 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:57:58.140343 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:58.140310 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.140300873 +0000 UTC m=+37.010956801 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : configmap references non-existent config key: service-ca.crt Apr 21 03:57:58.140466 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:58.140401 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:57:58.140466 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:58.140433 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls podName:5db70efe-7a8f-4630-ba85-c061199340f6 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.140423281 +0000 UTC m=+37.011079215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gxtnl" (UID: "5db70efe-7a8f-4630-ba85-c061199340f6") : secret "samples-operator-tls" not found Apr 21 03:57:58.896906 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:58.896874 2580 generic.go:358] "Generic (PLEG): container finished" podID="798393e0-1967-4ff8-bdbd-5debf844db1d" containerID="94fffdfa549714b6859a1c83af51f028677f8aebdbb5afecf08ae58896862dab" exitCode=0 Apr 21 03:57:58.897278 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:58.896920 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerDied","Data":"94fffdfa549714b6859a1c83af51f028677f8aebdbb5afecf08ae58896862dab"} Apr 21 03:57:59.754650 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:59.754616 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:57:59.754650 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:59.754655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:57:59.754830 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:59.754763 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:57:59.754830 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:59.754781 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:57:59.754830 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:59.754821 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert podName:c3592d56-3229-4a9f-8d19-2b45ed61d4c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:03.754806044 +0000 UTC m=+40.625461972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert") pod "ingress-canary-tj58v" (UID: "c3592d56-3229-4a9f-8d19-2b45ed61d4c0") : secret "canary-serving-cert" not found Apr 21 03:57:59.754967 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:57:59.754835 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls podName:772f98a5-784d-4ae7-9617-a1b4f77424fb nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:03.754829403 +0000 UTC m=+40.625485330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls") pod "dns-default-29n5x" (UID: "772f98a5-784d-4ae7-9617-a1b4f77424fb") : secret "dns-default-metrics-tls" not found Apr 21 03:57:59.903869 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:59.903780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" event={"ID":"798393e0-1967-4ff8-bdbd-5debf844db1d","Type":"ContainerStarted","Data":"6177dd08a0ab83791ced1361c1e821e6662d3950103673d5ef106e228ce446f0"} Apr 21 03:57:59.905542 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:59.905511 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn" event={"ID":"da84618b-ba61-4e8a-a07c-4ccc02317ed5","Type":"ContainerStarted","Data":"886f3d0f666feda9b357870b251d000ec41dc03341327d27a29bf639dbfd7526"} Apr 21 03:57:59.924573 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:59.924535 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hjl4c" podStartSLOduration=6.181601874 podStartE2EDuration="36.924524016s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:57:26.07982841 +0000 UTC m=+2.950484339" lastFinishedPulling="2026-04-21 03:57:56.822750549 +0000 UTC m=+33.693406481" observedRunningTime="2026-04-21 03:57:59.924342521 +0000 UTC m=+36.794998472" watchObservedRunningTime="2026-04-21 03:57:59.924524016 +0000 UTC m=+36.795179967" Apr 21 03:57:59.940504 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:57:59.940463 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn" podStartSLOduration=1.7376764279999999 
podStartE2EDuration="3.940451904s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:57:56.987988255 +0000 UTC m=+33.858644188" lastFinishedPulling="2026-04-21 03:57:59.190763734 +0000 UTC m=+36.061419664" observedRunningTime="2026-04-21 03:57:59.939410547 +0000 UTC m=+36.810066496" watchObservedRunningTime="2026-04-21 03:57:59.940451904 +0000 UTC m=+36.811107854" Apr 21 03:58:00.157768 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:00.157663 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh" Apr 21 03:58:00.157768 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:00.157714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl" Apr 21 03:58:00.157768 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:00.157749 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh" Apr 21 03:58:00.158014 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:00.157857 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:58:00.158014 ip-10-0-128-88 kubenswrapper[2580]: E0421 
03:58:00.157877 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.157855328 +0000 UTC m=+41.028511267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : configmap references non-existent config key: service-ca.crt Apr 21 03:58:00.158014 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:00.157897 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 03:58:00.158014 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:00.157901 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls podName:5db70efe-7a8f-4630-ba85-c061199340f6 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.157891121 +0000 UTC m=+41.028547051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gxtnl" (UID: "5db70efe-7a8f-4630-ba85-c061199340f6") : secret "samples-operator-tls" not found Apr 21 03:58:00.158014 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:00.157968 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:04.157945991 +0000 UTC m=+41.028601922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : secret "router-metrics-certs-default" not found Apr 21 03:58:01.720057 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.720024 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f"] Apr 21 03:58:01.755454 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.755427 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f"] Apr 21 03:58:01.755577 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.755532 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" Apr 21 03:58:01.757303 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.757280 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 03:58:01.757423 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.757309 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 03:58:01.757423 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.757317 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vwq56\"" Apr 21 03:58:01.872511 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.872478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtx62\" (UniqueName: \"kubernetes.io/projected/22bf5a92-f511-48ff-86be-263716a64584-kube-api-access-mtx62\") pod \"migrator-74bb7799d9-zkx9f\" (UID: 
\"22bf5a92-f511-48ff-86be-263716a64584\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" Apr 21 03:58:01.973259 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.973190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtx62\" (UniqueName: \"kubernetes.io/projected/22bf5a92-f511-48ff-86be-263716a64584-kube-api-access-mtx62\") pod \"migrator-74bb7799d9-zkx9f\" (UID: \"22bf5a92-f511-48ff-86be-263716a64584\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" Apr 21 03:58:01.981676 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:01.981654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtx62\" (UniqueName: \"kubernetes.io/projected/22bf5a92-f511-48ff-86be-263716a64584-kube-api-access-mtx62\") pod \"migrator-74bb7799d9-zkx9f\" (UID: \"22bf5a92-f511-48ff-86be-263716a64584\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" Apr 21 03:58:02.064156 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.064117 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" Apr 21 03:58:02.201548 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.201509 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f"] Apr 21 03:58:02.206858 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:02.206828 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22bf5a92_f511_48ff_86be_263716a64584.slice/crio-67646e808e6cb60ce23ad2ffa95c59030b9ae4c0fdede3aae1f015f2b73f108c WatchSource:0}: Error finding container 67646e808e6cb60ce23ad2ffa95c59030b9ae4c0fdede3aae1f015f2b73f108c: Status 404 returned error can't find the container with id 67646e808e6cb60ce23ad2ffa95c59030b9ae4c0fdede3aae1f015f2b73f108c Apr 21 03:58:02.824777 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.824743 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2qfpj"] Apr 21 03:58:02.860868 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.860822 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2qfpj"] Apr 21 03:58:02.861017 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.860909 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:02.864858 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.864836 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 03:58:02.864973 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.864843 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 03:58:02.865222 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.865209 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 03:58:02.865276 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.865241 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 03:58:02.865690 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.865678 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-26jtx\"" Apr 21 03:58:02.912873 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.912845 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" event={"ID":"22bf5a92-f511-48ff-86be-263716a64584","Type":"ContainerStarted","Data":"67646e808e6cb60ce23ad2ffa95c59030b9ae4c0fdede3aae1f015f2b73f108c"} Apr 21 03:58:02.942822 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.942799 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vjbp2_8db5c95c-dcdd-437d-bbd8-b52b4146dc61/dns-node-resolver/0.log" Apr 21 03:58:02.982482 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.982447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr9vw\" (UniqueName: 
\"kubernetes.io/projected/2afab9d3-3310-4373-b2c3-243486539ac8-kube-api-access-tr9vw\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:02.982609 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.982499 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2afab9d3-3310-4373-b2c3-243486539ac8-signing-cabundle\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:02.982609 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:02.982556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2afab9d3-3310-4373-b2c3-243486539ac8-signing-key\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.083524 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.083438 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr9vw\" (UniqueName: \"kubernetes.io/projected/2afab9d3-3310-4373-b2c3-243486539ac8-kube-api-access-tr9vw\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.083524 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.083498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2afab9d3-3310-4373-b2c3-243486539ac8-signing-cabundle\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.083729 ip-10-0-128-88 
kubenswrapper[2580]: I0421 03:58:03.083558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2afab9d3-3310-4373-b2c3-243486539ac8-signing-key\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.084352 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.084317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2afab9d3-3310-4373-b2c3-243486539ac8-signing-cabundle\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.091200 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.091166 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr9vw\" (UniqueName: \"kubernetes.io/projected/2afab9d3-3310-4373-b2c3-243486539ac8-kube-api-access-tr9vw\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.096649 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.096630 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2afab9d3-3310-4373-b2c3-243486539ac8-signing-key\") pod \"service-ca-865cb79987-2qfpj\" (UID: \"2afab9d3-3310-4373-b2c3-243486539ac8\") " pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.170018 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.169987 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2qfpj" Apr 21 03:58:03.288157 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.288127 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2qfpj"] Apr 21 03:58:03.538043 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:03.538011 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2afab9d3_3310_4373_b2c3_243486539ac8.slice/crio-a97492ee2e665d6224e33ff07161c3475745d3bce0a5465e9d54b1076983e8ab WatchSource:0}: Error finding container a97492ee2e665d6224e33ff07161c3475745d3bce0a5465e9d54b1076983e8ab: Status 404 returned error can't find the container with id a97492ee2e665d6224e33ff07161c3475745d3bce0a5465e9d54b1076983e8ab Apr 21 03:58:03.737251 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.737223 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lrfz2_aaab1022-57cd-4e71-8136-36d25cbe7fa1/node-ca/0.log" Apr 21 03:58:03.788709 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.788635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v" Apr 21 03:58:03.788709 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.788673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x" Apr 21 03:58:03.788890 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:03.788772 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: 
secret "canary-serving-cert" not found Apr 21 03:58:03.788890 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:03.788834 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert podName:c3592d56-3229-4a9f-8d19-2b45ed61d4c0 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:11.788817574 +0000 UTC m=+48.659473502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert") pod "ingress-canary-tj58v" (UID: "c3592d56-3229-4a9f-8d19-2b45ed61d4c0") : secret "canary-serving-cert" not found Apr 21 03:58:03.788976 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:03.788897 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:03.788976 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:03.788939 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls podName:772f98a5-784d-4ae7-9617-a1b4f77424fb nodeName:}" failed. No retries permitted until 2026-04-21 03:58:11.788927313 +0000 UTC m=+48.659583241 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls") pod "dns-default-29n5x" (UID: "772f98a5-784d-4ae7-9617-a1b4f77424fb") : secret "dns-default-metrics-tls" not found Apr 21 03:58:03.916123 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.916091 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2qfpj" event={"ID":"2afab9d3-3310-4373-b2c3-243486539ac8","Type":"ContainerStarted","Data":"7c3accfb166a3f9af4a32a01fe307d101b4bc292696410f1b61b05afd226606f"} Apr 21 03:58:03.916123 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.916125 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2qfpj" event={"ID":"2afab9d3-3310-4373-b2c3-243486539ac8","Type":"ContainerStarted","Data":"a97492ee2e665d6224e33ff07161c3475745d3bce0a5465e9d54b1076983e8ab"} Apr 21 03:58:03.936184 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:03.936057 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2qfpj" podStartSLOduration=1.936021558 podStartE2EDuration="1.936021558s" podCreationTimestamp="2026-04-21 03:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:03.935443348 +0000 UTC m=+40.806099298" watchObservedRunningTime="2026-04-21 03:58:03.936021558 +0000 UTC m=+40.806677508" Apr 21 03:58:04.192214 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:04.192169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh" Apr 21 03:58:04.192381 
ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:04.192229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:58:04.192381 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:04.192266 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:04.192381 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:04.192335 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.192309699 +0000 UTC m=+49.062965647 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : configmap references non-existent config key: service-ca.crt
Apr 21 03:58:04.192381 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:04.192373 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 21 03:58:04.192381 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:04.192381 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 03:58:04.192557 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:04.192419 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls podName:5db70efe-7a8f-4630-ba85-c061199340f6 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.192409807 +0000 UTC m=+49.063065739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gxtnl" (UID: "5db70efe-7a8f-4630-ba85-c061199340f6") : secret "samples-operator-tls" not found
Apr 21 03:58:04.192557 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:04.192432 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs podName:f4c8739a-c60c-42cd-bc9f-8648b4999008 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:12.192426472 +0000 UTC m=+49.063082399 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs") pod "router-default-57d784794-br7xh" (UID: "f4c8739a-c60c-42cd-bc9f-8648b4999008") : secret "router-metrics-certs-default" not found
Apr 21 03:58:04.920100 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:04.920047 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" event={"ID":"22bf5a92-f511-48ff-86be-263716a64584","Type":"ContainerStarted","Data":"de2e3735e2752972c8d031bda3184abe83ab5b3c22957867885d4bc8cb2de25a"}
Apr 21 03:58:04.920500 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:04.920107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" event={"ID":"22bf5a92-f511-48ff-86be-263716a64584","Type":"ContainerStarted","Data":"9132b4d741414634aff5a0e71fccb9bda335957ce6125aa75fb9ee36f2c070fd"}
Apr 21 03:58:04.936521 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:04.936466 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zkx9f" podStartSLOduration=2.2617940340000002 podStartE2EDuration="3.9364546s" podCreationTimestamp="2026-04-21 03:58:01 +0000 UTC" firstStartedPulling="2026-04-21 03:58:02.20873552 +0000 UTC m=+39.079391447" lastFinishedPulling="2026-04-21 03:58:03.883396079 +0000 UTC m=+40.754052013" observedRunningTime="2026-04-21 03:58:04.936316244 +0000 UTC m=+41.806972195" watchObservedRunningTime="2026-04-21 03:58:04.9364546 +0000 UTC m=+41.807110550"
Apr 21 03:58:11.855878 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:11.855838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v"
Apr 21 03:58:11.855878 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:11.855886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:58:11.858187 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:11.858161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/772f98a5-784d-4ae7-9617-a1b4f77424fb-metrics-tls\") pod \"dns-default-29n5x\" (UID: \"772f98a5-784d-4ae7-9617-a1b4f77424fb\") " pod="openshift-dns/dns-default-29n5x"
Apr 21 03:58:11.858284 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:11.858259 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3592d56-3229-4a9f-8d19-2b45ed61d4c0-cert\") pod \"ingress-canary-tj58v\" (UID: \"c3592d56-3229-4a9f-8d19-2b45ed61d4c0\") " pod="openshift-ingress-canary/ingress-canary-tj58v"
Apr 21 03:58:11.945458 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:11.945430 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-29n5x"
Apr 21 03:58:11.959328 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:11.959302 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tj58v"
Apr 21 03:58:12.070412 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.070381 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-29n5x"]
Apr 21 03:58:12.073888 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:12.073845 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772f98a5_784d_4ae7_9617_a1b4f77424fb.slice/crio-5425d925f26b27924c0ecd356a46500cdceccefb9d87c825f684913988c1889c WatchSource:0}: Error finding container 5425d925f26b27924c0ecd356a46500cdceccefb9d87c825f684913988c1889c: Status 404 returned error can't find the container with id 5425d925f26b27924c0ecd356a46500cdceccefb9d87c825f684913988c1889c
Apr 21 03:58:12.085535 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.085499 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tj58v"]
Apr 21 03:58:12.088305 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:12.088282 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3592d56_3229_4a9f_8d19_2b45ed61d4c0.slice/crio-b5984f9afce6ea5e2b58148f970bff1deaf07469a3b32a7b9ff4b7a13027d9e2 WatchSource:0}: Error finding container b5984f9afce6ea5e2b58148f970bff1deaf07469a3b32a7b9ff4b7a13027d9e2: Status 404 returned error can't find the container with id b5984f9afce6ea5e2b58148f970bff1deaf07469a3b32a7b9ff4b7a13027d9e2
Apr 21 03:58:12.258522 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.258493 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:12.258522 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.258526 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:58:12.258726 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.258551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:12.259089 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.259058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4c8739a-c60c-42cd-bc9f-8648b4999008-service-ca-bundle\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:12.260873 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.260851 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4c8739a-c60c-42cd-bc9f-8648b4999008-metrics-certs\") pod \"router-default-57d784794-br7xh\" (UID: \"f4c8739a-c60c-42cd-bc9f-8648b4999008\") " pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:12.260977 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.260857 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5db70efe-7a8f-4630-ba85-c061199340f6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gxtnl\" (UID: \"5db70efe-7a8f-4630-ba85-c061199340f6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:58:12.281738 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.281713 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"
Apr 21 03:58:12.334887 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.334595 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:12.401489 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.401459 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl"]
Apr 21 03:58:12.459729 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.459701 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57d784794-br7xh"]
Apr 21 03:58:12.462675 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:12.462649 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c8739a_c60c_42cd_bc9f_8648b4999008.slice/crio-f685548a27f6c8225586ff6c6181213391fa0120f578c1bbcd1790419bc30e06 WatchSource:0}: Error finding container f685548a27f6c8225586ff6c6181213391fa0120f578c1bbcd1790419bc30e06: Status 404 returned error can't find the container with id f685548a27f6c8225586ff6c6181213391fa0120f578c1bbcd1790419bc30e06
Apr 21 03:58:12.937356 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.937267 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-29n5x" event={"ID":"772f98a5-784d-4ae7-9617-a1b4f77424fb","Type":"ContainerStarted","Data":"5425d925f26b27924c0ecd356a46500cdceccefb9d87c825f684913988c1889c"}
Apr 21 03:58:12.938646 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.938618 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tj58v" event={"ID":"c3592d56-3229-4a9f-8d19-2b45ed61d4c0","Type":"ContainerStarted","Data":"b5984f9afce6ea5e2b58148f970bff1deaf07469a3b32a7b9ff4b7a13027d9e2"}
Apr 21 03:58:12.939733 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.939662 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl" event={"ID":"5db70efe-7a8f-4630-ba85-c061199340f6","Type":"ContainerStarted","Data":"5ffae91297374da7fd146f61b817d10f2bea322ebdb9e3da8defa33cf916db84"}
Apr 21 03:58:12.941879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.941829 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57d784794-br7xh" event={"ID":"f4c8739a-c60c-42cd-bc9f-8648b4999008","Type":"ContainerStarted","Data":"ffe0abfb5845f34e56ecfae0e4ee821b357275fdfaca51254b971ef7d55c7225"}
Apr 21 03:58:12.941879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.941860 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57d784794-br7xh" event={"ID":"f4c8739a-c60c-42cd-bc9f-8648b4999008","Type":"ContainerStarted","Data":"f685548a27f6c8225586ff6c6181213391fa0120f578c1bbcd1790419bc30e06"}
Apr 21 03:58:12.959418 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:12.959364 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-57d784794-br7xh" podStartSLOduration=16.959351656 podStartE2EDuration="16.959351656s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:12.958668971 +0000 UTC m=+49.829324967" watchObservedRunningTime="2026-04-21 03:58:12.959351656 +0000 UTC m=+49.830007584"
Apr 21 03:58:13.338098 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:13.335857 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:13.340226 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:13.340101 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:13.944600 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:13.944549 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:13.955229 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:13.955196 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-57d784794-br7xh"
Apr 21 03:58:14.952226 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:14.951510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tj58v" event={"ID":"c3592d56-3229-4a9f-8d19-2b45ed61d4c0","Type":"ContainerStarted","Data":"37f98aece57e90f763dc37394ed44204379a67410f639ef7d795ac7eaea3b235"}
Apr 21 03:58:14.955729 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:14.955198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl" event={"ID":"5db70efe-7a8f-4630-ba85-c061199340f6","Type":"ContainerStarted","Data":"c77fe7be209c51bc94a28c787d80cac7b3765b51895c1ed27c73be3eae4fe644"}
Apr 21 03:58:14.955729 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:14.955230 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl" event={"ID":"5db70efe-7a8f-4630-ba85-c061199340f6","Type":"ContainerStarted","Data":"e3c5d9ec43e7cb365654291f595329cfab1f656d823fcc5f8b81656696671a67"}
Apr 21 03:58:14.964778 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:14.964733 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-29n5x" event={"ID":"772f98a5-784d-4ae7-9617-a1b4f77424fb","Type":"ContainerStarted","Data":"2de5ece039c5f8a58406b0f8a127b41b905b733f24104224310d227f9bec216f"}
Apr 21 03:58:14.968225 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:14.967842 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tj58v" podStartSLOduration=17.341722247 podStartE2EDuration="19.967826799s" podCreationTimestamp="2026-04-21 03:57:55 +0000 UTC" firstStartedPulling="2026-04-21 03:58:12.089997624 +0000 UTC m=+48.960653555" lastFinishedPulling="2026-04-21 03:58:14.716102172 +0000 UTC m=+51.586758107" observedRunningTime="2026-04-21 03:58:14.966532874 +0000 UTC m=+51.837188828" watchObservedRunningTime="2026-04-21 03:58:14.967826799 +0000 UTC m=+51.838482751"
Apr 21 03:58:14.981874 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:14.981829 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gxtnl" podStartSLOduration=16.729392503 podStartE2EDuration="18.981813252s" podCreationTimestamp="2026-04-21 03:57:56 +0000 UTC" firstStartedPulling="2026-04-21 03:58:12.467344476 +0000 UTC m=+49.338000407" lastFinishedPulling="2026-04-21 03:58:14.719765211 +0000 UTC m=+51.590421156" observedRunningTime="2026-04-21 03:58:14.980795001 +0000 UTC m=+51.851450953" watchObservedRunningTime="2026-04-21 03:58:14.981813252 +0000 UTC m=+51.852469205"
Apr 21 03:58:15.968378 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:15.968340 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-29n5x" event={"ID":"772f98a5-784d-4ae7-9617-a1b4f77424fb","Type":"ContainerStarted","Data":"b43dc4c79fc3da91debb23091aa492be5a1e3edc0bbab32c7232c77be776a5ae"}
Apr 21 03:58:15.968879 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:15.968860 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-29n5x"
Apr 21 03:58:15.983273 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:15.983222 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-29n5x" podStartSLOduration=18.344611633 podStartE2EDuration="20.983208681s" podCreationTimestamp="2026-04-21 03:57:55 +0000 UTC" firstStartedPulling="2026-04-21 03:58:12.07524714 +0000 UTC m=+48.945903069" lastFinishedPulling="2026-04-21 03:58:14.713844184 +0000 UTC m=+51.584500117" observedRunningTime="2026-04-21 03:58:15.983055086 +0000 UTC m=+52.853711047" watchObservedRunningTime="2026-04-21 03:58:15.983208681 +0000 UTC m=+52.853864631"
Apr 21 03:58:20.891514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:20.891485 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gl9hq"
Apr 21 03:58:23.589632 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.589591 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jbmcx"]
Apr 21 03:58:23.594962 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.594939 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.597679 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.597293 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 03:58:23.597679 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.597529 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 03:58:23.598013 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.597989 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qz42s\""
Apr 21 03:58:23.598901 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.598875 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 03:58:23.600393 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.600368 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 03:58:23.607402 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.607110 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jbmcx"]
Apr 21 03:58:23.637865 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.637829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2cb64835-0101-4b02-85c8-73487693689f-data-volume\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.638002 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.637944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2cb64835-0101-4b02-85c8-73487693689f-crio-socket\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.638062 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.637999 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2cb64835-0101-4b02-85c8-73487693689f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.638062 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.638028 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2cb64835-0101-4b02-85c8-73487693689f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.638189 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.638063 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2lhd\" (UniqueName: \"kubernetes.io/projected/2cb64835-0101-4b02-85c8-73487693689f-kube-api-access-p2lhd\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.738412 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.738380 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2cb64835-0101-4b02-85c8-73487693689f-crio-socket\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.738599 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.738437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2cb64835-0101-4b02-85c8-73487693689f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.738599 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.738474 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2cb64835-0101-4b02-85c8-73487693689f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.738599 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.738518 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2lhd\" (UniqueName: \"kubernetes.io/projected/2cb64835-0101-4b02-85c8-73487693689f-kube-api-access-p2lhd\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.738599 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.738595 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2cb64835-0101-4b02-85c8-73487693689f-data-volume\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.739443 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.739006 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2cb64835-0101-4b02-85c8-73487693689f-data-volume\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.739443 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.739383 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2cb64835-0101-4b02-85c8-73487693689f-crio-socket\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.739680 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.739459 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2cb64835-0101-4b02-85c8-73487693689f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.741807 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.741746 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2cb64835-0101-4b02-85c8-73487693689f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.760016 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.759991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2lhd\" (UniqueName: \"kubernetes.io/projected/2cb64835-0101-4b02-85c8-73487693689f-kube-api-access-p2lhd\") pod \"insights-runtime-extractor-jbmcx\" (UID: \"2cb64835-0101-4b02-85c8-73487693689f\") " pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:23.909584 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:23.909499 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jbmcx"
Apr 21 03:58:24.024568 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:24.024540 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jbmcx"]
Apr 21 03:58:24.028670 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:24.028636 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb64835_0101_4b02_85c8_73487693689f.slice/crio-d9aa2762ec9ab70f0c9f781bd73c47cb013bbad0bf515b32873793edd90020a1 WatchSource:0}: Error finding container d9aa2762ec9ab70f0c9f781bd73c47cb013bbad0bf515b32873793edd90020a1: Status 404 returned error can't find the container with id d9aa2762ec9ab70f0c9f781bd73c47cb013bbad0bf515b32873793edd90020a1
Apr 21 03:58:24.995417 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:24.995382 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jbmcx" event={"ID":"2cb64835-0101-4b02-85c8-73487693689f","Type":"ContainerStarted","Data":"142890f32994f5170156c8f72059c983374fd240b16a46c1bb1a68831ef46719"}
Apr 21 03:58:24.995755 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:24.995424 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jbmcx" event={"ID":"2cb64835-0101-4b02-85c8-73487693689f","Type":"ContainerStarted","Data":"5dd2652026aefed4c3f0c09985c8f03c68e95e35d05e6d6a56e4e6c9b6e6da35"}
Apr 21 03:58:24.995755 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:24.995435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jbmcx" event={"ID":"2cb64835-0101-4b02-85c8-73487693689f","Type":"ContainerStarted","Data":"d9aa2762ec9ab70f0c9f781bd73c47cb013bbad0bf515b32873793edd90020a1"}
Apr 21 03:58:25.999019 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:25.998992 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jbmcx" event={"ID":"2cb64835-0101-4b02-85c8-73487693689f","Type":"ContainerStarted","Data":"9b615be8ae69f2825e3ce040255a359b9726871d0e48528ca6ffc0def311e4eb"}
Apr 21 03:58:26.014925 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:26.014880 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jbmcx" podStartSLOduration=1.168383597 podStartE2EDuration="3.014866242s" podCreationTimestamp="2026-04-21 03:58:23 +0000 UTC" firstStartedPulling="2026-04-21 03:58:24.081580349 +0000 UTC m=+60.952236276" lastFinishedPulling="2026-04-21 03:58:25.928062989 +0000 UTC m=+62.798718921" observedRunningTime="2026-04-21 03:58:26.01378622 +0000 UTC m=+62.884442169" watchObservedRunningTime="2026-04-21 03:58:26.014866242 +0000 UTC m=+62.885522192"
Apr 21 03:58:26.976543 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:26.976516 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-29n5x"
Apr 21 03:58:29.377504 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.377468 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t"
Apr 21 03:58:29.377883 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.377524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:58:29.379368 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.379348 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 03:58:29.379437 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.379348 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 03:58:29.390064 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.390046 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 03:58:29.390736 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.390721 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8746933a-dcd1-407c-8ebf-6ce3af9d58c0-metrics-certs\") pod \"network-metrics-daemon-x476t\" (UID: \"8746933a-dcd1-407c-8ebf-6ce3af9d58c0\") " pod="openshift-multus/network-metrics-daemon-x476t"
Apr 21 03:58:29.401313 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.401291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgrn\" (UniqueName: \"kubernetes.io/projected/89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1-kube-api-access-qdgrn\") pod \"network-check-target-f65gr\" (UID: \"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1\") " pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:58:29.514229 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.514200 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tn8dn\""
Apr 21 03:58:29.522935 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.520225 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-rcf6j\""
Apr 21 03:58:29.523099 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.522977 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:58:29.529038 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.529015 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x476t"
Apr 21 03:58:29.648654 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.648584 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x476t"]
Apr 21 03:58:29.651382 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:29.651357 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8746933a_dcd1_407c_8ebf_6ce3af9d58c0.slice/crio-5aeebd99e7a4dabac0f1daa8654b2060cf435c9fa305858cd557f7cf30e69155 WatchSource:0}: Error finding container 5aeebd99e7a4dabac0f1daa8654b2060cf435c9fa305858cd557f7cf30e69155: Status 404 returned error can't find the container with id 5aeebd99e7a4dabac0f1daa8654b2060cf435c9fa305858cd557f7cf30e69155
Apr 21 03:58:29.662340 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:29.662316 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f65gr"]
Apr 21 03:58:29.665303 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:29.665280 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89ffc5c7_9bcc_4d22_ad71_079b7d40a5d1.slice/crio-bb56d19631516e8618d242ebb645d538ef24de7e8118c0f5c1de085be2a95cb6 WatchSource:0}: Error finding container bb56d19631516e8618d242ebb645d538ef24de7e8118c0f5c1de085be2a95cb6: Status 404 returned error can't find the container with id bb56d19631516e8618d242ebb645d538ef24de7e8118c0f5c1de085be2a95cb6
Apr 21 03:58:30.009598 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:30.009563 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x476t" event={"ID":"8746933a-dcd1-407c-8ebf-6ce3af9d58c0","Type":"ContainerStarted","Data":"5aeebd99e7a4dabac0f1daa8654b2060cf435c9fa305858cd557f7cf30e69155"}
Apr 21 03:58:30.010535 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:30.010511 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f65gr" event={"ID":"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1","Type":"ContainerStarted","Data":"bb56d19631516e8618d242ebb645d538ef24de7e8118c0f5c1de085be2a95cb6"}
Apr 21 03:58:32.018983 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:32.018942 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x476t" event={"ID":"8746933a-dcd1-407c-8ebf-6ce3af9d58c0","Type":"ContainerStarted","Data":"9e75f51f52494551adc766b791196785b2d6e1961b75479860fa66722066f207"}
Apr 21 03:58:32.018983 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:32.018987 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x476t" event={"ID":"8746933a-dcd1-407c-8ebf-6ce3af9d58c0","Type":"ContainerStarted","Data":"a1bfe9a47d96631ec04da643e558e28345c3339dc2bbc72ca2900fc0071289e4"}
Apr 21 03:58:32.033675 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:32.033583 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x476t" podStartSLOduration=67.787971928 podStartE2EDuration="1m9.033566795s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:58:29.653320418 +0000 UTC m=+66.523976350" lastFinishedPulling="2026-04-21 03:58:30.898915286 +0000 UTC m=+67.769571217" observedRunningTime="2026-04-21 03:58:32.033025358 +0000 UTC m=+68.903681312" watchObservedRunningTime="2026-04-21 03:58:32.033566795 +0000 UTC m=+68.904222745"
Apr 21 03:58:33.022295 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:33.022260 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f65gr" event={"ID":"89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1","Type":"ContainerStarted","Data":"efe9de08c4dacdf444b13c8bf3fcbbe8e9cc1e963f7a90129a4442c0e6009bf3"}
Apr 21 03:58:33.022658 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:33.022477 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-f65gr"
Apr 21 03:58:33.036213 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:33.036173 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-f65gr" podStartSLOduration=67.260548407 podStartE2EDuration="1m10.036161584s" podCreationTimestamp="2026-04-21 03:57:23 +0000 UTC" firstStartedPulling="2026-04-21 03:58:29.666957239 +0000 UTC m=+66.537613167" lastFinishedPulling="2026-04-21 03:58:32.442570416 +0000 UTC m=+69.313226344" observedRunningTime="2026-04-21 03:58:33.035362085 +0000 UTC m=+69.906018034" watchObservedRunningTime="2026-04-21 03:58:33.036161584 +0000 UTC m=+69.906817534"
Apr 21 03:58:40.131999 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.131885 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw"]
Apr 21 03:58:40.135421 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.135400 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.136613 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.136588 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zx8jh"] Apr 21 03:58:40.137366 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.137340 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 03:58:40.137455 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.137378 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 03:58:40.137713 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.137693 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 03:58:40.137785 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.137716 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 03:58:40.137785 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.137697 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5vr6z\"" Apr 21 03:58:40.137885 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.137697 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 03:58:40.139638 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.139622 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.141338 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.141319 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 03:58:40.141462 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.141439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 03:58:40.141462 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.141343 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pnzbl\"" Apr 21 03:58:40.141638 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.141322 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 03:58:40.142765 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.142742 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jcf7w"] Apr 21 03:58:40.145885 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.145864 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.146477 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.146454 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw"] Apr 21 03:58:40.148015 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.147854 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 03:58:40.148015 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.147912 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 03:58:40.148015 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.147921 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4kks8\"" Apr 21 03:58:40.148235 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.148171 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 03:58:40.163791 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.163556 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jcf7w"] Apr 21 03:58:40.251564 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251529 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.251708 ip-10-0-128-88 kubenswrapper[2580]: I0421 
03:58:40.251578 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-metrics-client-ca\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.251708 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251627 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.251708 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251677 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-accelerators-collector-config\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.251811 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251704 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0fef8221-7674-4fa6-849b-be7e4e2ffda0-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.251811 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251763 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.251871 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fef8221-7674-4fa6-849b-be7e4e2ffda0-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.251871 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-root\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.251871 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251860 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7md64\" (UniqueName: \"kubernetes.io/projected/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-api-access-7md64\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.251960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: 
\"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.251960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251917 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-kube-api-access-p62fv\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.251960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.252113 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngx9\" (UniqueName: \"kubernetes.io/projected/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-kube-api-access-7ngx9\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.252113 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.251998 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-sys\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.252113 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.252058 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-wtmp\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.252260 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.252131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-tls\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.252260 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.252162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.252260 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.252212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-textfile\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.252260 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.252240 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.352997 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.352962 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7md64\" (UniqueName: \"kubernetes.io/projected/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-api-access-7md64\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.353217 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.353217 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-kube-api-access-p62fv\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353217 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:40.353150 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 03:58:40.353382 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.353382 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:40.353225 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-tls podName:0fef8221-7674-4fa6-849b-be7e4e2ffda0 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:40.8532037 +0000 UTC m=+77.723859630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jcf7w" (UID: "0fef8221-7674-4fa6-849b-be7e4e2ffda0") : secret "kube-state-metrics-tls" not found Apr 21 03:58:40.353382 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353260 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngx9\" (UniqueName: \"kubernetes.io/projected/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-kube-api-access-7ngx9\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.353382 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-sys\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353382 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353319 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-wtmp\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353382 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:40.353327 2580 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:40.353392 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-tls podName:d97ba0ce-69e2-41b1-a18a-edc1ade7702c nodeName:}" failed. No retries permitted until 2026-04-21 03:58:40.853374089 +0000 UTC m=+77.724030022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-2chdw" (UID: "d97ba0ce-69e2-41b1-a18a-edc1ade7702c") : secret "openshift-state-metrics-tls" not found Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-sys\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353455 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-wtmp\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 
03:58:40.353470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-tls\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353501 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353562 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-textfile\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353590 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.353681 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353656 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-metrics-client-ca\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353722 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-accelerators-collector-config\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353749 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0fef8221-7674-4fa6-849b-be7e4e2ffda0-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: 
\"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353800 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fef8221-7674-4fa6-849b-be7e4e2ffda0-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-root\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.354122 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.353887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-root\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.354464 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.354444 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-accelerators-collector-config\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.354619 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.354596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.354840 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.354820 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-textfile\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.355205 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.355183 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fef8221-7674-4fa6-849b-be7e4e2ffda0-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.355604 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.355583 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-metrics-client-ca\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.356100 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.356061 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.356242 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.356207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/0fef8221-7674-4fa6-849b-be7e4e2ffda0-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.356345 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.356302 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-tls\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.356951 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.356928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.357317 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.357295 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.358022 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.357998 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.361331 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.361303 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/80dc2c7a-c7d6-4faf-91e9-83b408f0ea18-kube-api-access-p62fv\") pod \"node-exporter-zx8jh\" (UID: \"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18\") " pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.362022 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.361993 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7md64\" (UniqueName: \"kubernetes.io/projected/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-api-access-7md64\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.363008 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.362987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngx9\" (UniqueName: \"kubernetes.io/projected/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-kube-api-access-7ngx9\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" 
Apr 21 03:58:40.455810 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.455779 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zx8jh" Apr 21 03:58:40.477046 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:40.477011 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80dc2c7a_c7d6_4faf_91e9_83b408f0ea18.slice/crio-9d1bc1fae0ca1f73ca7e2163827b2f3d292a472d0de31fe70d9321156c9872d0 WatchSource:0}: Error finding container 9d1bc1fae0ca1f73ca7e2163827b2f3d292a472d0de31fe70d9321156c9872d0: Status 404 returned error can't find the container with id 9d1bc1fae0ca1f73ca7e2163827b2f3d292a472d0de31fe70d9321156c9872d0 Apr 21 03:58:40.859049 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.858963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:40.859049 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.859012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.861756 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.861726 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d97ba0ce-69e2-41b1-a18a-edc1ade7702c-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-9d44df66c-2chdw\" (UID: \"d97ba0ce-69e2-41b1-a18a-edc1ade7702c\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:40.861874 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:40.861795 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0fef8221-7674-4fa6-849b-be7e4e2ffda0-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jcf7w\" (UID: \"0fef8221-7674-4fa6-849b-be7e4e2ffda0\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:41.043478 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.043445 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zx8jh" event={"ID":"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18","Type":"ContainerStarted","Data":"9d1bc1fae0ca1f73ca7e2163827b2f3d292a472d0de31fe70d9321156c9872d0"} Apr 21 03:58:41.049797 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.049773 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" Apr 21 03:58:41.060627 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.060605 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" Apr 21 03:58:41.273487 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.273457 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 03:58:41.277402 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.277369 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.280243 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280208 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 03:58:41.280243 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280229 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 03:58:41.280405 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280257 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-qdl9m\"" Apr 21 03:58:41.280458 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280436 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 03:58:41.280508 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280493 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 03:58:41.280575 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280558 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 03:58:41.280647 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 03:58:41.280799 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280765 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 03:58:41.280873 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280804 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 03:58:41.281354 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.280989 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 03:58:41.288048 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.287889 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw"] Apr 21 03:58:41.290992 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:41.290375 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97ba0ce_69e2_41b1_a18a_edc1ade7702c.slice/crio-9d91f7394b8f77eeeb54390773ff1be5b2bd0d63e436e54329a65168283faa85 WatchSource:0}: Error finding container 9d91f7394b8f77eeeb54390773ff1be5b2bd0d63e436e54329a65168283faa85: Status 404 returned error can't find the container with id 9d91f7394b8f77eeeb54390773ff1be5b2bd0d63e436e54329a65168283faa85 Apr 21 03:58:41.297783 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.297761 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 03:58:41.303653 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.303541 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jcf7w"] Apr 21 03:58:41.314962 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:41.314929 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fef8221_7674_4fa6_849b_be7e4e2ffda0.slice/crio-2899fdbce0b43e2ab39f5b8ed892218165d0f54e3bde44a88c170c7dcf65f676 WatchSource:0}: Error finding container 2899fdbce0b43e2ab39f5b8ed892218165d0f54e3bde44a88c170c7dcf65f676: Status 404 returned error can't find the container with id 2899fdbce0b43e2ab39f5b8ed892218165d0f54e3bde44a88c170c7dcf65f676 Apr 21 
03:58:41.361851 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.361829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-web-config\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.361949 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.361865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.361949 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.361882 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.361949 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.361898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362066 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.361956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362066 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.361995 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh5mw\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-kube-api-access-jh5mw\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362066 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.362024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-out\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362066 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.362039 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362066 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.362058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362267 ip-10-0-128-88 kubenswrapper[2580]: I0421 
03:58:41.362122 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362267 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.362168 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362267 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.362212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.362267 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.362235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463030 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-web-config\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463113 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463149 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463194 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463205 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh5mw\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-kube-api-access-jh5mw\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-out\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463264 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463289 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.463338 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.463700 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:41.463585 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle podName:aa0455ea-62bc-4b3e-a367-2cd821d5fa11 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:41.963561546 +0000 UTC m=+78.834217487 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11") : configmap references non-existent config key: ca-bundle.crt Apr 21 03:58:41.463796 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:41.463708 2580 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 21 03:58:41.463796 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:41.463763 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls podName:aa0455ea-62bc-4b3e-a367-2cd821d5fa11 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:41.963745437 +0000 UTC m=+78.834401376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11") : secret "alertmanager-main-tls" not found Apr 21 03:58:41.464182 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.464124 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.464700 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.464673 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.466447 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.466425 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-out\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.466583 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.466563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-web-config\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.467025 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.467002 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.467148 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.467045 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.467213 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.467163 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.467786 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.467766 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-volume\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.469519 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.469039 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-tls-assets\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.469519 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.469443 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.471676 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.471654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh5mw\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-kube-api-access-jh5mw\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.967552 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.967505 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.967552 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.967551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.968461 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.968413 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:41.972458 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:41.972431 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:42.048705 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.048664 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" event={"ID":"0fef8221-7674-4fa6-849b-be7e4e2ffda0","Type":"ContainerStarted","Data":"2899fdbce0b43e2ab39f5b8ed892218165d0f54e3bde44a88c170c7dcf65f676"} Apr 21 03:58:42.050610 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.050577 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="80dc2c7a-c7d6-4faf-91e9-83b408f0ea18" containerID="69629e52d6a56c27d179b46316dbd2f6a2b945b68f654a8b9c179db9a2d0323a" exitCode=0 Apr 21 03:58:42.050746 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.050691 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zx8jh" event={"ID":"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18","Type":"ContainerDied","Data":"69629e52d6a56c27d179b46316dbd2f6a2b945b68f654a8b9c179db9a2d0323a"} Apr 21 03:58:42.054073 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.054005 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" event={"ID":"d97ba0ce-69e2-41b1-a18a-edc1ade7702c","Type":"ContainerStarted","Data":"6667942f1ff39f2795aeca86bb97bb8657d8ede3e1c654d59d0accb165602d5b"} Apr 21 03:58:42.054073 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.054038 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" event={"ID":"d97ba0ce-69e2-41b1-a18a-edc1ade7702c","Type":"ContainerStarted","Data":"391a5843874bf5e5969c8ad1b0cd411d3b4b3a6c5dab7c59d18057f6f7e7684e"} Apr 21 03:58:42.054073 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.054053 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" event={"ID":"d97ba0ce-69e2-41b1-a18a-edc1ade7702c","Type":"ContainerStarted","Data":"9d91f7394b8f77eeeb54390773ff1be5b2bd0d63e436e54329a65168283faa85"} Apr 21 03:58:42.190485 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.190453 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 03:58:42.683701 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.683677 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-hffk9"] Apr 21 03:58:42.687170 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.687147 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:58:42.690466 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.690441 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 03:58:42.690466 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.690467 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-k5q44\"" Apr 21 03:58:42.690737 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.690719 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 03:58:42.697897 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.697809 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hffk9"] Apr 21 03:58:42.775247 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.775216 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm6sz\" (UniqueName: \"kubernetes.io/projected/fa8f0c1f-765d-4933-a93e-b01b48713a92-kube-api-access-lm6sz\") pod \"downloads-6bcc868b7-hffk9\" (UID: \"fa8f0c1f-765d-4933-a93e-b01b48713a92\") " pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:58:42.830125 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.830067 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 03:58:42.835578 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:42.835538 2580 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0455ea_62bc_4b3e_a367_2cd821d5fa11.slice/crio-82d1963171ed420c54cd569c0c533bb04b5d20c41ba9842eb81fdb2fe30a3859 WatchSource:0}: Error finding container 82d1963171ed420c54cd569c0c533bb04b5d20c41ba9842eb81fdb2fe30a3859: Status 404 returned error can't find the container with id 82d1963171ed420c54cd569c0c533bb04b5d20c41ba9842eb81fdb2fe30a3859 Apr 21 03:58:42.876723 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.876691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm6sz\" (UniqueName: \"kubernetes.io/projected/fa8f0c1f-765d-4933-a93e-b01b48713a92-kube-api-access-lm6sz\") pod \"downloads-6bcc868b7-hffk9\" (UID: \"fa8f0c1f-765d-4933-a93e-b01b48713a92\") " pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:58:42.884812 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:42.884791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm6sz\" (UniqueName: \"kubernetes.io/projected/fa8f0c1f-765d-4933-a93e-b01b48713a92-kube-api-access-lm6sz\") pod \"downloads-6bcc868b7-hffk9\" (UID: \"fa8f0c1f-765d-4933-a93e-b01b48713a92\") " pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:58:43.048743 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.048702 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:58:43.059708 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.059655 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" event={"ID":"d97ba0ce-69e2-41b1-a18a-edc1ade7702c","Type":"ContainerStarted","Data":"e24f47da93662ef068f39ee4871c4b541c4c29186ce1b8b59895f80370b12f86"} Apr 21 03:58:43.063173 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.062981 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" event={"ID":"0fef8221-7674-4fa6-849b-be7e4e2ffda0","Type":"ContainerStarted","Data":"0f2c02aff8e9672b7572df8858006c25a0c3e5509c5fe25214a050b25616c20b"} Apr 21 03:58:43.063173 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.063029 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" event={"ID":"0fef8221-7674-4fa6-849b-be7e4e2ffda0","Type":"ContainerStarted","Data":"7648a5e734e419f2fa1a6a62043ab97e4feb606ea7475eaf52f6f02d298b276e"} Apr 21 03:58:43.063173 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.063045 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" event={"ID":"0fef8221-7674-4fa6-849b-be7e4e2ffda0","Type":"ContainerStarted","Data":"c820ea938d6b05e20504967af6ea02187852ef06da95f0ae3a34f353a3698983"} Apr 21 03:58:43.065144 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.065115 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"82d1963171ed420c54cd569c0c533bb04b5d20c41ba9842eb81fdb2fe30a3859"} Apr 21 03:58:43.067365 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.067336 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-zx8jh" event={"ID":"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18","Type":"ContainerStarted","Data":"e4dc07ac1801916111dfe2c8be2c13a61db1c9cc949452a5d2e9b4932edae590"} Apr 21 03:58:43.067477 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.067371 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zx8jh" event={"ID":"80dc2c7a-c7d6-4faf-91e9-83b408f0ea18","Type":"ContainerStarted","Data":"fde30d8f78c4d8e13562bc1211fcfc00eee6c34629e70d167d8207f7c34441d1"} Apr 21 03:58:43.076620 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.076562 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2chdw" podStartSLOduration=1.820341608 podStartE2EDuration="3.076548653s" podCreationTimestamp="2026-04-21 03:58:40 +0000 UTC" firstStartedPulling="2026-04-21 03:58:41.436950151 +0000 UTC m=+78.307606079" lastFinishedPulling="2026-04-21 03:58:42.693157182 +0000 UTC m=+79.563813124" observedRunningTime="2026-04-21 03:58:43.075401129 +0000 UTC m=+79.946057076" watchObservedRunningTime="2026-04-21 03:58:43.076548653 +0000 UTC m=+79.947204602" Apr 21 03:58:43.098832 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.098780 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jcf7w" podStartSLOduration=1.7247837000000001 podStartE2EDuration="3.098745151s" podCreationTimestamp="2026-04-21 03:58:40 +0000 UTC" firstStartedPulling="2026-04-21 03:58:41.317179028 +0000 UTC m=+78.187834959" lastFinishedPulling="2026-04-21 03:58:42.691140466 +0000 UTC m=+79.561796410" observedRunningTime="2026-04-21 03:58:43.097963172 +0000 UTC m=+79.968619124" watchObservedRunningTime="2026-04-21 03:58:43.098745151 +0000 UTC m=+79.969401103" Apr 21 03:58:43.128813 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.128762 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-zx8jh" podStartSLOduration=2.423772233 podStartE2EDuration="3.128742077s" podCreationTimestamp="2026-04-21 03:58:40 +0000 UTC" firstStartedPulling="2026-04-21 03:58:40.482826997 +0000 UTC m=+77.353482935" lastFinishedPulling="2026-04-21 03:58:41.187796837 +0000 UTC m=+78.058452779" observedRunningTime="2026-04-21 03:58:43.127651234 +0000 UTC m=+79.998307176" watchObservedRunningTime="2026-04-21 03:58:43.128742077 +0000 UTC m=+79.999398028" Apr 21 03:58:43.175651 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:43.175618 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hffk9"] Apr 21 03:58:43.180603 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:43.180571 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa8f0c1f_765d_4933_a93e_b01b48713a92.slice/crio-ea4d9464cc916116f0c62700eea803cc61c85bace851a58edd9a9b06e97e5890 WatchSource:0}: Error finding container ea4d9464cc916116f0c62700eea803cc61c85bace851a58edd9a9b06e97e5890: Status 404 returned error can't find the container with id ea4d9464cc916116f0c62700eea803cc61c85bace851a58edd9a9b06e97e5890 Apr 21 03:58:44.072235 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.072142 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerID="05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0" exitCode=0 Apr 21 03:58:44.072235 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.072226 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0"} Apr 21 03:58:44.073810 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.073742 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-hffk9" event={"ID":"fa8f0c1f-765d-4933-a93e-b01b48713a92","Type":"ContainerStarted","Data":"ea4d9464cc916116f0c62700eea803cc61c85bace851a58edd9a9b06e97e5890"} Apr 21 03:58:44.540075 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.540041 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6945946c54-md5kz"] Apr 21 03:58:44.543528 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.543506 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.545840 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.545813 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 03:58:44.547298 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.547272 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 03:58:44.547800 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.547773 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-asreb0nj148jg\"" Apr 21 03:58:44.547891 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.547822 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 03:58:44.547957 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.547914 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 03:58:44.548008 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.547823 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-hmgtm\"" Apr 21 03:58:44.554133 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.554108 2580 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6945946c54-md5kz"] Apr 21 03:58:44.590297 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590258 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-secret-metrics-server-client-certs\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.590435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zjs\" (UniqueName: \"kubernetes.io/projected/074fb1e8-3fa3-4376-9a76-b629900a63b2-kube-api-access-s7zjs\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.590435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590392 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074fb1e8-3fa3-4376-9a76-b629900a63b2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.590435 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-client-ca-bundle\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.590607 
ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/074fb1e8-3fa3-4376-9a76-b629900a63b2-metrics-server-audit-profiles\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.590607 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590474 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-secret-metrics-server-tls\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.590607 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.590499 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/074fb1e8-3fa3-4376-9a76-b629900a63b2-audit-log\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.691962 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.691926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zjs\" (UniqueName: \"kubernetes.io/projected/074fb1e8-3fa3-4376-9a76-b629900a63b2-kube-api-access-s7zjs\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.692176 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.691979 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/074fb1e8-3fa3-4376-9a76-b629900a63b2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.692176 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.692011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-client-ca-bundle\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.692176 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.692040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/074fb1e8-3fa3-4376-9a76-b629900a63b2-metrics-server-audit-profiles\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.692176 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.692075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-secret-metrics-server-tls\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.692176 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.692130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/074fb1e8-3fa3-4376-9a76-b629900a63b2-audit-log\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 
03:58:44.692376 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.692182 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-secret-metrics-server-client-certs\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.692813 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.692760 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/074fb1e8-3fa3-4376-9a76-b629900a63b2-audit-log\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.693151 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.693126 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/074fb1e8-3fa3-4376-9a76-b629900a63b2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.693385 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.693340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/074fb1e8-3fa3-4376-9a76-b629900a63b2-metrics-server-audit-profiles\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.695365 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.695335 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-client-ca-bundle\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.695469 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.695390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-secret-metrics-server-tls\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.695547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.695531 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/074fb1e8-3fa3-4376-9a76-b629900a63b2-secret-metrics-server-client-certs\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.700335 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.700313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zjs\" (UniqueName: \"kubernetes.io/projected/074fb1e8-3fa3-4376-9a76-b629900a63b2-kube-api-access-s7zjs\") pod \"metrics-server-6945946c54-md5kz\" (UID: \"074fb1e8-3fa3-4376-9a76-b629900a63b2\") " pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.860801 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.860717 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:58:44.949407 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.949376 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw"] Apr 21 03:58:44.954174 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.954054 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:44.960255 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.956530 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 03:58:44.960255 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.956603 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fc86z\"" Apr 21 03:58:44.966099 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.964140 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw"] Apr 21 03:58:44.996211 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:44.996157 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8ef523e-3eec-4098-936e-2664b106f3c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xzsgw\" (UID: \"d8ef523e-3eec-4098-936e-2664b106f3c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:45.011318 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:45.011271 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6945946c54-md5kz"] Apr 21 03:58:45.097747 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:45.097654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/d8ef523e-3eec-4098-936e-2664b106f3c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xzsgw\" (UID: \"d8ef523e-3eec-4098-936e-2664b106f3c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:45.098147 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:45.097806 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 03:58:45.098147 ip-10-0-128-88 kubenswrapper[2580]: E0421 03:58:45.097877 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8ef523e-3eec-4098-936e-2664b106f3c3-monitoring-plugin-cert podName:d8ef523e-3eec-4098-936e-2664b106f3c3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:45.597856547 +0000 UTC m=+82.468512482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d8ef523e-3eec-4098-936e-2664b106f3c3-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-xzsgw" (UID: "d8ef523e-3eec-4098-936e-2664b106f3c3") : secret "monitoring-plugin-cert" not found Apr 21 03:58:45.330817 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:45.330773 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074fb1e8_3fa3_4376_9a76_b629900a63b2.slice/crio-75cf20b03b41bb5e032e8e46c231332142384d2a97263bbf08a5e400659d3bd8 WatchSource:0}: Error finding container 75cf20b03b41bb5e032e8e46c231332142384d2a97263bbf08a5e400659d3bd8: Status 404 returned error can't find the container with id 75cf20b03b41bb5e032e8e46c231332142384d2a97263bbf08a5e400659d3bd8 Apr 21 03:58:45.602414 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:45.602386 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8ef523e-3eec-4098-936e-2664b106f3c3-monitoring-plugin-cert\") pod 
\"monitoring-plugin-7dccd58f55-xzsgw\" (UID: \"d8ef523e-3eec-4098-936e-2664b106f3c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:45.604759 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:45.604640 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8ef523e-3eec-4098-936e-2664b106f3c3-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xzsgw\" (UID: \"d8ef523e-3eec-4098-936e-2664b106f3c3\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:45.871401 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:45.871300 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:46.026290 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.026261 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw"] Apr 21 03:58:46.028802 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:46.028771 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8ef523e_3eec_4098_936e_2664b106f3c3.slice/crio-157a3286b3d14a831e87c9bd158e4ace5728e3da34b9105c9c98bb1ac27d8647 WatchSource:0}: Error finding container 157a3286b3d14a831e87c9bd158e4ace5728e3da34b9105c9c98bb1ac27d8647: Status 404 returned error can't find the container with id 157a3286b3d14a831e87c9bd158e4ace5728e3da34b9105c9c98bb1ac27d8647 Apr 21 03:58:46.082731 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.082693 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" event={"ID":"d8ef523e-3eec-4098-936e-2664b106f3c3","Type":"ContainerStarted","Data":"157a3286b3d14a831e87c9bd158e4ace5728e3da34b9105c9c98bb1ac27d8647"} Apr 21 03:58:46.086419 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.086389 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0"} Apr 21 03:58:46.086547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.086427 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082"} Apr 21 03:58:46.086547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.086442 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8"} Apr 21 03:58:46.086547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.086455 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c"} Apr 21 03:58:46.086547 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.086467 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9"} Apr 21 03:58:46.088067 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.088014 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" event={"ID":"074fb1e8-3fa3-4376-9a76-b629900a63b2","Type":"ContainerStarted","Data":"75cf20b03b41bb5e032e8e46c231332142384d2a97263bbf08a5e400659d3bd8"} Apr 21 03:58:46.558341 ip-10-0-128-88 kubenswrapper[2580]: 
I0421 03:58:46.558310 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 03:58:46.563116 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.562995 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.570640 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.569998 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 03:58:46.570640 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.570376 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 03:58:46.570640 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.570557 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 03:58:46.570984 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.570968 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 03:58:46.570984 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.570979 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 03:58:46.571135 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.571022 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 03:58:46.571135 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.571029 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7tbhb\"" Apr 21 03:58:46.571248 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.570972 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 03:58:46.571248 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.570970 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cu7ihvba280bf\"" Apr 21 03:58:46.572413 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.572395 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 03:58:46.572622 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.572596 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 03:58:46.573843 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.573823 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 03:58:46.576023 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.576002 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 03:58:46.583106 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.583069 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 03:58:46.597040 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.596814 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 03:58:46.612401 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612290 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612401 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612335 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612401 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612582 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612453 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612582 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612491 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612582 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612516 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612582 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612711 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612749 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612723 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612790 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612755 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612906 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612828 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhtt\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-kube-api-access-lzhtt\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.612906 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.613044 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.612960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-config\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.613116 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.613040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.613184 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.613117 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.613184 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.613145 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-web-config\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.613282 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.613188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.613282 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.613218 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-config-out\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714040 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714003 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714057 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-config\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714138 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-web-config\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714220 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714210 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-config-out\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714279 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714306 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714378 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 
ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714403 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714429 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714514 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714868 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714868 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714552 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.714868 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.714595 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhtt\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-kube-api-access-lzhtt\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.715415 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.715349 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.715862 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.715833 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.716922 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.716900 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.719457 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.718197 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.719457 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.718799 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.719457 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.719421 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.723856 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.723835 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-config\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.723940 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.723918 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.725133 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.724935 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.725133 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.725095 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhtt\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-kube-api-access-lzhtt\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.725665 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.725614 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-web-config\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.725794 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.725684 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.725794 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.725748 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.726254 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.726231 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-config-out\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.726559 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.726513 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.726963 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.726906 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.727303 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.727261 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.728725 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.727589 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:46.880759 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:46.880157 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:58:47.112856 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:47.112832 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 03:58:47.646871 ip-10-0-128-88 kubenswrapper[2580]: W0421 03:58:47.646824 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27127b19_097c_4c7f_a91d_7309ae32d53a.slice/crio-a02b1867db918fff224aeea2005432f574da0dd19d203fdcd1ea5b530f91821c WatchSource:0}: Error finding container a02b1867db918fff224aeea2005432f574da0dd19d203fdcd1ea5b530f91821c: Status 404 returned error can't find the container with id a02b1867db918fff224aeea2005432f574da0dd19d203fdcd1ea5b530f91821c Apr 21 03:58:48.096859 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.096763 2580 generic.go:358] "Generic (PLEG): container finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420" exitCode=0 Apr 21 03:58:48.097006 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.096853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} Apr 21 03:58:48.097006 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.096893 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"a02b1867db918fff224aeea2005432f574da0dd19d203fdcd1ea5b530f91821c"} Apr 21 03:58:48.102143 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.102119 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerStarted","Data":"4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9"} Apr 21 03:58:48.103575 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.103540 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" event={"ID":"074fb1e8-3fa3-4376-9a76-b629900a63b2","Type":"ContainerStarted","Data":"218f10d3529365b71c2055656131e6d61b2c526700235428f2893cbcd330331c"} Apr 21 03:58:48.105038 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.105018 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" event={"ID":"d8ef523e-3eec-4098-936e-2664b106f3c3","Type":"ContainerStarted","Data":"00cc2c07d08200b17c051894c81d2165a891fe6cfdf7bbeea5a1602ddfb5b8ae"} Apr 21 03:58:48.105471 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.105390 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:48.112239 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.112221 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" Apr 21 03:58:48.145991 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.145934 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.027661199 podStartE2EDuration="7.145918048s" podCreationTimestamp="2026-04-21 03:58:41 +0000 UTC" firstStartedPulling="2026-04-21 03:58:42.838165983 +0000 
UTC m=+79.708821925" lastFinishedPulling="2026-04-21 03:58:46.956422832 +0000 UTC m=+83.827078774" observedRunningTime="2026-04-21 03:58:48.144737975 +0000 UTC m=+85.015393924" watchObservedRunningTime="2026-04-21 03:58:48.145918048 +0000 UTC m=+85.016573999" Apr 21 03:58:48.166218 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.165731 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" podStartSLOduration=1.6755237840000001 podStartE2EDuration="4.165714915s" podCreationTimestamp="2026-04-21 03:58:44 +0000 UTC" firstStartedPulling="2026-04-21 03:58:45.333239161 +0000 UTC m=+82.203895100" lastFinishedPulling="2026-04-21 03:58:47.823430014 +0000 UTC m=+84.694086231" observedRunningTime="2026-04-21 03:58:48.163965752 +0000 UTC m=+85.034621699" watchObservedRunningTime="2026-04-21 03:58:48.165714915 +0000 UTC m=+85.036370867" Apr 21 03:58:48.182857 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:48.182794 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xzsgw" podStartSLOduration=2.384253957 podStartE2EDuration="4.182776072s" podCreationTimestamp="2026-04-21 03:58:44 +0000 UTC" firstStartedPulling="2026-04-21 03:58:46.031371747 +0000 UTC m=+82.902027690" lastFinishedPulling="2026-04-21 03:58:47.829893871 +0000 UTC m=+84.700549805" observedRunningTime="2026-04-21 03:58:48.17841383 +0000 UTC m=+85.049069782" watchObservedRunningTime="2026-04-21 03:58:48.182776072 +0000 UTC m=+85.053432025" Apr 21 03:58:51.119357 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:51.119231 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} Apr 21 03:58:51.119357 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:51.119284 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} Apr 21 03:58:53.130662 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:53.130627 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} Apr 21 03:58:53.130662 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:53.130670 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} Apr 21 03:58:53.131155 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:53.130683 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} Apr 21 03:58:53.131155 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:53.130695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerStarted","Data":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} Apr 21 03:58:53.156211 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:53.156157 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.5649484019999997 podStartE2EDuration="7.156137372s" podCreationTimestamp="2026-04-21 03:58:46 +0000 UTC" firstStartedPulling="2026-04-21 03:58:48.098298979 +0000 UTC m=+84.968954909" 
lastFinishedPulling="2026-04-21 03:58:52.689487947 +0000 UTC m=+89.560143879" observedRunningTime="2026-04-21 03:58:53.154133651 +0000 UTC m=+90.024789602" watchObservedRunningTime="2026-04-21 03:58:53.156137372 +0000 UTC m=+90.026793325" Apr 21 03:58:56.880963 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:58:56.880925 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:59:01.163956 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:01.163915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hffk9" event={"ID":"fa8f0c1f-765d-4933-a93e-b01b48713a92","Type":"ContainerStarted","Data":"e31d5696030e170856df44240da46a69fd966a5057691176cf3e802b98e57726"} Apr 21 03:59:01.164482 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:01.164452 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:59:01.179484 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:01.179444 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-hffk9" podStartSLOduration=2.076271612 podStartE2EDuration="19.179431361s" podCreationTimestamp="2026-04-21 03:58:42 +0000 UTC" firstStartedPulling="2026-04-21 03:58:43.188274776 +0000 UTC m=+80.058930719" lastFinishedPulling="2026-04-21 03:59:00.291434535 +0000 UTC m=+97.162090468" observedRunningTime="2026-04-21 03:59:01.177704317 +0000 UTC m=+98.048360268" watchObservedRunningTime="2026-04-21 03:59:01.179431361 +0000 UTC m=+98.050087311" Apr 21 03:59:01.180769 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:01.180745 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-hffk9" Apr 21 03:59:04.028305 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:04.028272 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-f65gr" Apr 21 03:59:04.861791 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:04.861747 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:59:04.861960 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:04.861801 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:59:24.867128 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:24.867076 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:59:24.870902 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:24.870881 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6945946c54-md5kz" Apr 21 03:59:30.250603 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:30.250570 2580 generic.go:358] "Generic (PLEG): container finished" podID="da84618b-ba61-4e8a-a07c-4ccc02317ed5" containerID="886f3d0f666feda9b357870b251d000ec41dc03341327d27a29bf639dbfd7526" exitCode=0 Apr 21 03:59:30.251131 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:30.250633 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn" event={"ID":"da84618b-ba61-4e8a-a07c-4ccc02317ed5","Type":"ContainerDied","Data":"886f3d0f666feda9b357870b251d000ec41dc03341327d27a29bf639dbfd7526"} Apr 21 03:59:30.251131 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:30.250938 2580 scope.go:117] "RemoveContainer" containerID="886f3d0f666feda9b357870b251d000ec41dc03341327d27a29bf639dbfd7526" Apr 21 03:59:31.254797 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:31.254764 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7f2dn" 
event={"ID":"da84618b-ba61-4e8a-a07c-4ccc02317ed5","Type":"ContainerStarted","Data":"cc60de36d8daac7838d1c987380847462d583e1d6b7875e3b08e590b160864f9"} Apr 21 03:59:46.881347 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:46.881305 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:59:46.896901 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:46.896875 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 03:59:47.312937 ip-10-0-128-88 kubenswrapper[2580]: I0421 03:59:47.312907 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:00.652394 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.652360 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:00:00.653182 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.652926 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="alertmanager" containerID="cri-o://bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9" gracePeriod=120 Apr 21 04:00:00.653182 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.652963 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-metric" containerID="cri-o://69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0" gracePeriod=120 Apr 21 04:00:00.653182 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.652984 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-web" 
containerID="cri-o://aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8" gracePeriod=120 Apr 21 04:00:00.653182 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.653024 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy" containerID="cri-o://3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082" gracePeriod=120 Apr 21 04:00:00.653182 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.653060 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="prom-label-proxy" containerID="cri-o://4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9" gracePeriod=120 Apr 21 04:00:00.653182 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:00.653047 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="config-reloader" containerID="cri-o://204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c" gracePeriod=120 Apr 21 04:00:01.345044 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345004 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerID="4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9" exitCode=0 Apr 21 04:00:01.345044 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345032 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerID="69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0" exitCode=0 Apr 21 04:00:01.345044 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345041 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" 
containerID="3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082" exitCode=0 Apr 21 04:00:01.345044 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345048 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerID="204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c" exitCode=0 Apr 21 04:00:01.345322 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345056 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerID="bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9" exitCode=0 Apr 21 04:00:01.345322 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345099 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9"} Apr 21 04:00:01.345322 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345134 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0"} Apr 21 04:00:01.345322 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345145 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082"} Apr 21 04:00:01.345322 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345155 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c"} Apr 21 04:00:01.345322 
ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.345164 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9"} Apr 21 04:00:01.895833 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:01.895809 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:00:02.035773 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035672 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-main-db\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.035773 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035711 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-cluster-tls-config\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.035773 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035761 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036059 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035780 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-web-config\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" 
(UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036059 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035804 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-web\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036059 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035894 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-volume\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036059 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035955 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-out\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036059 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.035987 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-tls-assets\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036059 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036029 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-metric\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 
04:00:02.036381 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036123 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036381 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036161 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036381 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036200 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh5mw\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-kube-api-access-jh5mw\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036381 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036236 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-metrics-client-ca\") pod \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\" (UID: \"aa0455ea-62bc-4b3e-a367-2cd821d5fa11\") " Apr 21 04:00:02.036941 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036028 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:00:02.036941 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036210 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:02.036941 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.036819 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:02.038889 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.038866 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-out" (OuterVolumeSpecName: "config-out") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:00:02.039119 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.039065 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.039587 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.039560 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-volume" (OuterVolumeSpecName: "config-volume") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.039662 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.039584 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.039841 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.039815 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.039957 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.039844 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:02.040457 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.040442 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-kube-api-access-jh5mw" (OuterVolumeSpecName: "kube-api-access-jh5mw") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "kube-api-access-jh5mw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:02.041109 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.041091 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.043708 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.043644 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.050036 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.050013 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-web-config" (OuterVolumeSpecName: "web-config") pod "aa0455ea-62bc-4b3e-a367-2cd821d5fa11" (UID: "aa0455ea-62bc-4b3e-a367-2cd821d5fa11"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:02.137696 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137656 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-volume\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137696 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137687 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-config-out\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137696 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137696 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-tls-assets\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137696 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137705 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137714 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137724 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-main-tls\") on node \"ip-10-0-128-88.ec2.internal\" 
DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137733 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jh5mw\" (UniqueName: \"kubernetes.io/projected/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-kube-api-access-jh5mw\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137741 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-metrics-client-ca\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137751 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-main-db\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137760 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-cluster-tls-config\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137770 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.137778 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-web-config\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.137955 ip-10-0-128-88 kubenswrapper[2580]: I0421 
04:00:02.137788 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/aa0455ea-62bc-4b3e-a367-2cd821d5fa11-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:02.350514 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.350428 2580 generic.go:358] "Generic (PLEG): container finished" podID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerID="aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8" exitCode=0 Apr 21 04:00:02.350514 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.350482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8"} Apr 21 04:00:02.350514 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.350510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"aa0455ea-62bc-4b3e-a367-2cd821d5fa11","Type":"ContainerDied","Data":"82d1963171ed420c54cd569c0c533bb04b5d20c41ba9842eb81fdb2fe30a3859"} Apr 21 04:00:02.350712 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.350525 2580 scope.go:117] "RemoveContainer" containerID="4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9" Apr 21 04:00:02.350712 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.350545 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 04:00:02.358288 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.358124 2580 scope.go:117] "RemoveContainer" containerID="69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0" Apr 21 04:00:02.364986 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.364971 2580 scope.go:117] "RemoveContainer" containerID="3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082" Apr 21 04:00:02.371969 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.371925 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:00:02.372669 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.372650 2580 scope.go:117] "RemoveContainer" containerID="aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8" Apr 21 04:00:02.378569 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.378546 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 04:00:02.381117 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.381099 2580 scope.go:117] "RemoveContainer" containerID="204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c" Apr 21 04:00:02.387492 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.387477 2580 scope.go:117] "RemoveContainer" containerID="bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9" Apr 21 04:00:02.393639 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.393621 2580 scope.go:117] "RemoveContainer" containerID="05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0" Apr 21 04:00:02.400390 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.400374 2580 scope.go:117] "RemoveContainer" containerID="4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9" Apr 21 04:00:02.400652 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.400626 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9\": container with ID starting with 4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9 not found: ID does not exist" containerID="4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9" Apr 21 04:00:02.400697 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.400663 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9"} err="failed to get container status \"4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9\": rpc error: code = NotFound desc = could not find container \"4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9\": container with ID starting with 4f833104c641733435b853a579686379850c95ab20a1eb3c3a59a86336241ff9 not found: ID does not exist" Apr 21 04:00:02.400738 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.400696 2580 scope.go:117] "RemoveContainer" containerID="69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0" Apr 21 04:00:02.400890 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.400876 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0\": container with ID starting with 69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0 not found: ID does not exist" containerID="69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0" Apr 21 04:00:02.400950 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.400895 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0"} err="failed to get container status \"69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0\": rpc error: code = NotFound desc = could not find 
container \"69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0\": container with ID starting with 69e689156d0bb930277ea22053a06d3ac269256444ce0cb6c394a051f78f12a0 not found: ID does not exist" Apr 21 04:00:02.400950 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.400908 2580 scope.go:117] "RemoveContainer" containerID="3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082" Apr 21 04:00:02.401158 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.401140 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082\": container with ID starting with 3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082 not found: ID does not exist" containerID="3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082" Apr 21 04:00:02.401205 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401166 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082"} err="failed to get container status \"3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082\": rpc error: code = NotFound desc = could not find container \"3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082\": container with ID starting with 3108df52909da40c76f8f3f0e80179c650e0b6a5fac47b7d3c28b754f1c3f082 not found: ID does not exist" Apr 21 04:00:02.401205 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401183 2580 scope.go:117] "RemoveContainer" containerID="aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8" Apr 21 04:00:02.401414 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.401397 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8\": container with ID starting 
with aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8 not found: ID does not exist" containerID="aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8" Apr 21 04:00:02.401461 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401422 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8"} err="failed to get container status \"aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8\": rpc error: code = NotFound desc = could not find container \"aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8\": container with ID starting with aefeeee778462e79e43307c5005eabd16bed62167046ab343ba02416eab7b4c8 not found: ID does not exist" Apr 21 04:00:02.401461 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401438 2580 scope.go:117] "RemoveContainer" containerID="204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c" Apr 21 04:00:02.401635 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.401617 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c\": container with ID starting with 204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c not found: ID does not exist" containerID="204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c" Apr 21 04:00:02.401689 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401644 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c"} err="failed to get container status \"204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c\": rpc error: code = NotFound desc = could not find container \"204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c\": container with ID starting with 
204e2c149c48c2dd7910ea97f3ea9e7b888bd65738f91c431396116fe6a18d4c not found: ID does not exist" Apr 21 04:00:02.401689 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401666 2580 scope.go:117] "RemoveContainer" containerID="bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9" Apr 21 04:00:02.401901 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.401886 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9\": container with ID starting with bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9 not found: ID does not exist" containerID="bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9" Apr 21 04:00:02.401941 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401904 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9"} err="failed to get container status \"bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9\": rpc error: code = NotFound desc = could not find container \"bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9\": container with ID starting with bf2bf77049e9f2c0837fe624bf58d1c638798677c02a0da469f6b42eb8ac42f9 not found: ID does not exist" Apr 21 04:00:02.401941 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.401915 2580 scope.go:117] "RemoveContainer" containerID="05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0" Apr 21 04:00:02.402107 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:02.402072 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0\": container with ID starting with 05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0 not found: ID does not exist" 
containerID="05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0" Apr 21 04:00:02.402188 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:02.402110 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0"} err="failed to get container status \"05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0\": rpc error: code = NotFound desc = could not find container \"05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0\": container with ID starting with 05bf48a63e60ae2331a6246215afbbf0721aee9cbb920a1663414f7ae0f4efb0 not found: ID does not exist" Apr 21 04:00:03.706517 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:03.706481 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" path="/var/lib/kubelet/pods/aa0455ea-62bc-4b3e-a367-2cd821d5fa11/volumes" Apr 21 04:00:04.916353 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.916316 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:04.917019 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.916901 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="prometheus" containerID="cri-o://0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07" gracePeriod=600 Apr 21 04:00:04.917019 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.916972 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="thanos-sidecar" containerID="cri-o://1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079" gracePeriod=600 Apr 21 04:00:04.917197 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.917027 2580 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="config-reloader" containerID="cri-o://9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da" gracePeriod=600 Apr 21 04:00:04.917197 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.916977 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-web" containerID="cri-o://b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a" gracePeriod=600 Apr 21 04:00:04.917197 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.916986 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa" gracePeriod=600 Apr 21 04:00:04.917357 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:04.917253 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy" containerID="cri-o://a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927" gracePeriod=600 Apr 21 04:00:05.162069 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.162046 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.265337 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265305 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-tls\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265545 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265353 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-metrics-client-certs\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265545 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265387 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-config-out\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265545 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265411 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-web-config\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265545 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265466 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-kube-rbac-proxy\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265545 
ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265496 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265545 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265522 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-grpc-tls\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265557 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-thanos-prometheus-http-client-file\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265587 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265627 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhtt\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-kube-api-access-lzhtt\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 
04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265663 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-config\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265690 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-metrics-client-ca\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265729 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-trusted-ca-bundle\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265769 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-tls-assets\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265796 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-serving-certs-ca-bundle\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.265843 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265840 
2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-rulefiles-0\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.266304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265876 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-kubelet-serving-ca-bundle\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.266304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.265919 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-db\") pod \"27127b19-097c-4c7f-a91d-7309ae32d53a\" (UID: \"27127b19-097c-4c7f-a91d-7309ae32d53a\") " Apr 21 04:00:05.267363 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.267127 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:05.268165 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.267606 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:05.268165 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.267780 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:00:05.268165 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.268115 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:05.268793 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.268740 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:05.269106 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.268897 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.270149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.269835 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.270149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.269938 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:05.270149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.269960 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.270149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.269999 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:05.270149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.270013 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.270149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.270125 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-config-out" (OuterVolumeSpecName: "config-out") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:00:05.270554 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.270320 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-kube-api-access-lzhtt" (OuterVolumeSpecName: "kube-api-access-lzhtt") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "kube-api-access-lzhtt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:05.270610 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.270595 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.270765 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.270744 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.271002 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.270985 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.271065 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.271040 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-config" (OuterVolumeSpecName: "config") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.279199 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.279179 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-web-config" (OuterVolumeSpecName: "web-config") pod "27127b19-097c-4c7f-a91d-7309ae32d53a" (UID: "27127b19-097c-4c7f-a91d-7309ae32d53a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:05.363071 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363039 2580 generic.go:358] "Generic (PLEG): container finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa" exitCode=0 Apr 21 04:00:05.363071 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363068 2580 generic.go:358] "Generic (PLEG): container finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927" exitCode=0 Apr 21 04:00:05.363071 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363089 2580 generic.go:358] "Generic (PLEG): container finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a" exitCode=0 Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363097 2580 generic.go:358] "Generic (PLEG): container finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079" exitCode=0 Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363103 2580 generic.go:358] "Generic (PLEG): container finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da" exitCode=0 Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363111 2580 generic.go:358] "Generic (PLEG): container 
finished" podID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07" exitCode=0 Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363117 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363158 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363174 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa" Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363162 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} Apr 21 04:00:05.363304 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363300 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} Apr 21 04:00:05.363592 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363318 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} Apr 21 04:00:05.363592 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363333 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} Apr 21 04:00:05.363592 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363348 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} Apr 21 04:00:05.363592 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.363362 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"27127b19-097c-4c7f-a91d-7309ae32d53a","Type":"ContainerDied","Data":"a02b1867db918fff224aeea2005432f574da0dd19d203fdcd1ea5b530f91821c"} Apr 21 04:00:05.366640 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366614 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzhtt\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-kube-api-access-lzhtt\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366640 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366639 2580 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-config\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366650 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-metrics-client-ca\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366660 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366670 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/27127b19-097c-4c7f-a91d-7309ae32d53a-tls-assets\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366679 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366688 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366697 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27127b19-097c-4c7f-a91d-7309ae32d53a-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366707 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-prometheus-k8s-db\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366715 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366724 2580 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-metrics-client-certs\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366732 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/27127b19-097c-4c7f-a91d-7309ae32d53a-config-out\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366743 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-web-config\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366751 2580 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-kube-rbac-proxy\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366759 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366768 2580 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-grpc-tls\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366777 2580 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.366808 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.366785 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/27127b19-097c-4c7f-a91d-7309ae32d53a-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:00:05.370609 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.370590 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927" Apr 21 04:00:05.377514 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.377498 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a" Apr 21 04:00:05.383950 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.383931 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079" Apr 21 04:00:05.384767 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.384743 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:05.388877 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.388858 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:05.391421 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.391404 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da" Apr 
Apr 21 04:00:05.397477 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.397459 2580 scope.go:117] "RemoveContainer" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.404115 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.404068 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.410126 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410061 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.410321 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.410304 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.410369 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410328 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} err="failed to get container status \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist"
Apr 21 04:00:05.410369 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410346 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"
Apr 21 04:00:05.410542 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.410527 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"
Apr 21 04:00:05.410577 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410546 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} err="failed to get container status \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist"
Apr 21 04:00:05.410577 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410557 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"
Apr 21 04:00:05.410737 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.410723 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"
Apr 21 04:00:05.410774 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410741 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} err="failed to get container status \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": rpc error: code = NotFound desc = could not find container \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist"
Apr 21 04:00:05.410774 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410752 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"
Apr 21 04:00:05.410913 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.410899 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"
Apr 21 04:00:05.410950 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410916 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} err="failed to get container status \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist"
Apr 21 04:00:05.410950 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.410927 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"
Apr 21 04:00:05.411138 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.411120 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"
Apr 21 04:00:05.411187 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411141 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} err="failed to get container status \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist"
Apr 21 04:00:05.411187 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411154 2580 scope.go:117] "RemoveContainer" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.411352 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.411337 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.411398 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411355 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} err="failed to get container status \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist"
Apr 21 04:00:05.411398 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411369 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.411591 ip-10-0-128-88 kubenswrapper[2580]: E0421 04:00:05.411571 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.411650 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411601 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} err="failed to get container status \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist"
Apr 21 04:00:05.411650 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411625 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.411918 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411896 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} err="failed to get container status \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist"
Apr 21 04:00:05.411918 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.411918 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"
Apr 21 04:00:05.412286 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.412217 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} err="failed to get container status \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist"
Apr 21 04:00:05.412286 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.412254 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"
Apr 21 04:00:05.412617 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.412592 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} err="failed to get container status \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": rpc error: code = NotFound desc = could not find container \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist"
Apr 21 04:00:05.412715 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.412620 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"
Apr 21 04:00:05.412942 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.412915 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} err="failed to get container status \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist"
Apr 21 04:00:05.412942 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.412941 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"
Apr 21 04:00:05.413311 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413289 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} err="failed to get container status \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist"
Apr 21 04:00:05.413311 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413312 2580 scope.go:117] "RemoveContainer" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.413546 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413528 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} err="failed to get container status \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist"
Apr 21 04:00:05.413598 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413548 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.413747 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413731 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} err="failed to get container status \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist"
Apr 21 04:00:05.413806 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413747 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.414002 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.413986 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} err="failed to get container status \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist"
Apr 21 04:00:05.414002 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414002 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"
Apr 21 04:00:05.414269 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414250 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} err="failed to get container status \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist"
Apr 21 04:00:05.414327 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414271 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"
Apr 21 04:00:05.414366 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414319 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 04:00:05.414502 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414479 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} err="failed to get container status \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": rpc error: code = NotFound desc = could not find container \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist"
Apr 21 04:00:05.414573 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414504 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"
Apr 21 04:00:05.414689 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414672 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} err="failed to get container status \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist"
Apr 21 04:00:05.414735 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414697 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"
Apr 21 04:00:05.414735 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414676 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="prometheus"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414743 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="prometheus"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414758 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="thanos-sidecar"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414767 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="thanos-sidecar"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414775 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="prom-label-proxy"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414781 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="prom-label-proxy"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414792 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="alertmanager"
Apr 21 04:00:05.414800 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414798 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="alertmanager"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414803 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-web"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414809 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-web"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414818 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414825 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414835 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414843 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414853 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414861 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414873 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-metric"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414878 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-metric"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414884 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-thanos"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414892 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-thanos"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414906 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="init-config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414911 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="init-config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414918 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="init-config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414923 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="init-config-reloader"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414931 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414939 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414951 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-web"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414960 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-web"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414955 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} err="failed to get container status \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.414982 2580 scope.go:117] "RemoveContainer" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415043 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-thanos"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415054 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy"
Apr 21 04:00:05.415049 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415060 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy-web"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415067 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="prometheus"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415073 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-web"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415095 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="kube-rbac-proxy"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415105 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="alertmanager"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415113 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="config-reloader"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415120 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="prom-label-proxy"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415131 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="thanos-sidecar"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415141 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa0455ea-62bc-4b3e-a367-2cd821d5fa11" containerName="kube-rbac-proxy-metric"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415148 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" containerName="config-reloader"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415201 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} err="failed to get container status \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415220 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415485 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} err="failed to get container status \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415509 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415744 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} err="failed to get container status \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415759 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415960 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} err="failed to get container status \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist"
Apr 21 04:00:05.415994 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.415981 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"
Apr 21 04:00:05.416624 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416219 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} err="failed to get container status \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": rpc error: code = NotFound desc = could not find container \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist"
Apr 21 04:00:05.416624 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416235 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"
Apr 21 04:00:05.416624 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416426 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} err="failed to get container status \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist"
Apr 21 04:00:05.416624 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416442 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"
Apr 21 04:00:05.416757 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416646 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} err="failed to get container status \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist"
Apr 21 04:00:05.416757 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416663 2580 scope.go:117] "RemoveContainer" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.416847 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416832 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} err="failed to get container status \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist"
Apr 21 04:00:05.416883 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.416849 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.417054 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417035 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} err="failed to get container status \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist"
Apr 21 04:00:05.417134 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417054 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.417298 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417282 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} err="failed to get container status \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist"
Apr 21 04:00:05.417338 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417298 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"
Apr 21 04:00:05.417515 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417498 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} err="failed to get container status \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist"
Apr 21 04:00:05.417553 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417517 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"
Apr 21 04:00:05.417720 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417704 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} err="failed to get container status \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": rpc error: code = NotFound desc = could not find container \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist"
Apr 21 04:00:05.417774 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417719 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"
Apr 21 04:00:05.417932 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417915 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} err="failed to get container status \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist"
Apr 21 04:00:05.417969 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.417933 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"
Apr 21 04:00:05.418148 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418131 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} err="failed to get container status \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist"
Apr 21 04:00:05.418191 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418148 2580 scope.go:117] "RemoveContainer" containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"
Apr 21 04:00:05.418339 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418324 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} err="failed to get container status \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist"
Apr 21 04:00:05.418426 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418340 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"
Apr 21 04:00:05.418530 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418513 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} err="failed to get container status \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist"
Apr 21 04:00:05.418572 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418530 2580 scope.go:117] "RemoveContainer" containerID="4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"
Apr 21 04:00:05.418729 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418712 2580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa"} err="failed to get container status \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": rpc error: code = NotFound desc = could not find container \"4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa\": container with ID starting with 4da31c56bff7cd5c903e4d5694bdebc000c1662166cd017ba519c49ad42c6faa not found: ID does not exist" Apr 21 04:00:05.418787 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418728 2580 scope.go:117] "RemoveContainer" containerID="a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927" Apr 21 04:00:05.418922 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418906 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927"} err="failed to get container status \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": rpc error: code = NotFound desc = could not find container \"a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927\": container with ID starting with a29c4b0628cc4223ca7a3f8db496c9f0800bf429801ba80bf54121da66969927 not found: ID does not exist" Apr 21 04:00:05.418970 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.418923 2580 scope.go:117] "RemoveContainer" containerID="b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a" Apr 21 04:00:05.419205 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419189 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a"} err="failed to get container status \"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": rpc error: code = NotFound desc = could not find container 
\"b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a\": container with ID starting with b2d9c9eb6090a992bace1599c9ad9e1950ea7dbda6f3a3a5e55edc757a9c767a not found: ID does not exist" Apr 21 04:00:05.419251 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419206 2580 scope.go:117] "RemoveContainer" containerID="1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079" Apr 21 04:00:05.419387 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419371 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079"} err="failed to get container status \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": rpc error: code = NotFound desc = could not find container \"1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079\": container with ID starting with 1fdda02049ea7167dd2d99f8bd21eaa2c7b9bcec9dd3369db8f49b56e7cf2079 not found: ID does not exist" Apr 21 04:00:05.419427 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419387 2580 scope.go:117] "RemoveContainer" containerID="9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da" Apr 21 04:00:05.419582 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419566 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da"} err="failed to get container status \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": rpc error: code = NotFound desc = could not find container \"9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da\": container with ID starting with 9259dbdf9dca07671a14915cb319304c2d9b3968513c41d06a017611977319da not found: ID does not exist" Apr 21 04:00:05.419625 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419582 2580 scope.go:117] "RemoveContainer" 
containerID="0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07" Apr 21 04:00:05.419778 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419762 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07"} err="failed to get container status \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": rpc error: code = NotFound desc = could not find container \"0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07\": container with ID starting with 0a8304c908eaa3dacd4b3e64f2ef1bd3f287ff3afd09b09f890d845059bf0d07 not found: ID does not exist" Apr 21 04:00:05.419823 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419778 2580 scope.go:117] "RemoveContainer" containerID="208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420" Apr 21 04:00:05.419959 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.419941 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420"} err="failed to get container status \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": rpc error: code = NotFound desc = could not find container \"208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420\": container with ID starting with 208c35a37797145118f1a4e6abe8b698075e42bf3a7e7b3e2dc01d46ba73a420 not found: ID does not exist" Apr 21 04:00:05.420518 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.420504 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.422466 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.422434 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7tbhb\"" Apr 21 04:00:05.422645 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.422434 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:00:05.422645 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.422433 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:00:05.422645 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.422439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:00:05.422645 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.422497 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:00:05.423204 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.423183 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cu7ihvba280bf\"" Apr 21 04:00:05.423447 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.423429 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:00:05.423773 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.423753 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:00:05.424401 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.424257 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:00:05.425576 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.425553 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:00:05.425861 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.425841 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:00:05.429313 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.426214 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:00:05.432428 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.431864 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:00:05.432428 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.432266 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:05.434614 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.434595 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:00:05.568278 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568278 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568278 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568279 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568313 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-config\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-web-config\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568371 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 
04:00:05.568389 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/05126879-231f-4438-a109-69892c129bfd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568497 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/05126879-231f-4438-a109-69892c129bfd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568522 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568516 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcw6z\" (UniqueName: \"kubernetes.io/projected/05126879-231f-4438-a109-69892c129bfd-kube-api-access-lcw6z\") pod \"prometheus-k8s-0\" 
(UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568572 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568620 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568695 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568742 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/05126879-231f-4438-a109-69892c129bfd-config-out\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568781 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.568824 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.568821 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670171 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670171 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670361 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670361 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670343 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-config\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670468 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-web-config\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670515 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670566 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670523 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670616 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/05126879-231f-4438-a109-69892c129bfd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670616 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/05126879-231f-4438-a109-69892c129bfd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670715 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670627 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670715 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcw6z\" (UniqueName: \"kubernetes.io/projected/05126879-231f-4438-a109-69892c129bfd-kube-api-access-lcw6z\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670715 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670873 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670715 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670873 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670873 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670771 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.670873 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/05126879-231f-4438-a109-69892c129bfd-config-out\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.671060 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.670930 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.671842 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.671796 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.672031 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.671979 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.672449 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.672427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/05126879-231f-4438-a109-69892c129bfd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.673588 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.673235 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/05126879-231f-4438-a109-69892c129bfd-config-out\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:05.673588 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.673524 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-web-config\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.673755 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.673657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.674380 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.673931 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-config\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.674380 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.673985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.674380 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.674034 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.674380 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.674370 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.674635 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.674525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.675254 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.675227 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.675401 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.675377 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.675498 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.675477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/05126879-231f-4438-a109-69892c129bfd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.675802 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.675781 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.676553 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.676528 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.676718 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.676694 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/05126879-231f-4438-a109-69892c129bfd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.676835 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.676817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/05126879-231f-4438-a109-69892c129bfd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.677600 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.677584 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcw6z\" (UniqueName: \"kubernetes.io/projected/05126879-231f-4438-a109-69892c129bfd-kube-api-access-lcw6z\") pod \"prometheus-k8s-0\" (UID: \"05126879-231f-4438-a109-69892c129bfd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.705418 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.705384 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27127b19-097c-4c7f-a91d-7309ae32d53a" path="/var/lib/kubelet/pods/27127b19-097c-4c7f-a91d-7309ae32d53a/volumes"
Apr 21 04:00:05.735008 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.734978 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:05.856010 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:05.855978 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 04:00:05.858452 ip-10-0-128-88 kubenswrapper[2580]: W0421 04:00:05.858425 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05126879_231f_4438_a109_69892c129bfd.slice/crio-1afc5c65c18a653774e2a658382241c6e97ce1292090224b60a77a1239b54bcd WatchSource:0}: Error finding container 1afc5c65c18a653774e2a658382241c6e97ce1292090224b60a77a1239b54bcd: Status 404 returned error can't find the container with id 1afc5c65c18a653774e2a658382241c6e97ce1292090224b60a77a1239b54bcd
Apr 21 04:00:06.368518 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:06.368487 2580 generic.go:358] "Generic (PLEG): container finished" podID="05126879-231f-4438-a109-69892c129bfd" containerID="65df739a78fadca3dd82cec89e74a9670c862c732544c9de5b024869307190be" exitCode=0
Apr 21 04:00:06.368881 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:06.368544 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerDied","Data":"65df739a78fadca3dd82cec89e74a9670c862c732544c9de5b024869307190be"}
Apr 21 04:00:06.368881 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:06.368564 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"1afc5c65c18a653774e2a658382241c6e97ce1292090224b60a77a1239b54bcd"}
Apr 21 04:00:07.379282 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.379218 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"e25d134c3a9dc3e13b8ea5041803267289c8f4c2aa558de04a3eb710831b2ffe"}
Apr 21 04:00:07.379731 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.379304 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"a40b933bb920e84b8002f60f8bd6eec13b8cc508bd7638ee11cfa243186426ba"}
Apr 21 04:00:07.379731 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.379324 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"1125e9ce618e836ab83b228e67be1a6fd22bd0312649638547c7e44533acfe06"}
Apr 21 04:00:07.379731 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.379340 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"4ff25331f3824bd5a87b1653cf5bc7e2c526ac7e9c0c96181d801f9e7bb6fded"}
Apr 21 04:00:07.379731 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.379353 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"c2d3b5ea7995d244b9a6fa1ca53f4b3dd57033f7c992fc4dadc1498c2d6bd746"}
Apr 21 04:00:07.379731 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.379372 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"05126879-231f-4438-a109-69892c129bfd","Type":"ContainerStarted","Data":"3ddd6cb63ba816ccef86935be432a06609fccd8982c12808d8280f84f37cec64"}
Apr 21 04:00:07.415465 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:07.415415 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.415402166 podStartE2EDuration="2.415402166s" podCreationTimestamp="2026-04-21 04:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:00:07.413822162 +0000 UTC m=+164.284478112" watchObservedRunningTime="2026-04-21 04:00:07.415402166 +0000 UTC m=+164.286058115"
Apr 21 04:00:10.735737 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:10.735647 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:00:57.309938 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.309902 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xb5d5"]
Apr 21 04:00:57.313392 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.313374 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.314999 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.314977 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:00:57.318364 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.318341 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xb5d5"]
Apr 21 04:00:57.492283 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.492249 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3cad3688-3f99-4785-ae07-f31cc486dd1d-dbus\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.492451 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.492298 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3cad3688-3f99-4785-ae07-f31cc486dd1d-original-pull-secret\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.492451 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.492382 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3cad3688-3f99-4785-ae07-f31cc486dd1d-kubelet-config\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.593448 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.593349 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3cad3688-3f99-4785-ae07-f31cc486dd1d-original-pull-secret\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.593448 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.593392 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3cad3688-3f99-4785-ae07-f31cc486dd1d-kubelet-config\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.593448 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.593448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3cad3688-3f99-4785-ae07-f31cc486dd1d-dbus\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.593729 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.593551 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3cad3688-3f99-4785-ae07-f31cc486dd1d-kubelet-config\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.593729 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.593573 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3cad3688-3f99-4785-ae07-f31cc486dd1d-dbus\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.595751 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.595723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3cad3688-3f99-4785-ae07-f31cc486dd1d-original-pull-secret\") pod \"global-pull-secret-syncer-xb5d5\" (UID: \"3cad3688-3f99-4785-ae07-f31cc486dd1d\") " pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.622769 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.622747 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xb5d5"
Apr 21 04:00:57.742479 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:57.742453 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xb5d5"]
Apr 21 04:00:57.744947 ip-10-0-128-88 kubenswrapper[2580]: W0421 04:00:57.744912 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cad3688_3f99_4785_ae07_f31cc486dd1d.slice/crio-da8a084a5e67e549bad598e23f3335b17f5956ade208cd520d48b0acef0283eb WatchSource:0}: Error finding container da8a084a5e67e549bad598e23f3335b17f5956ade208cd520d48b0acef0283eb: Status 404 returned error can't find the container with id da8a084a5e67e549bad598e23f3335b17f5956ade208cd520d48b0acef0283eb
Apr 21 04:00:58.528882 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:00:58.528842 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xb5d5" event={"ID":"3cad3688-3f99-4785-ae07-f31cc486dd1d","Type":"ContainerStarted","Data":"da8a084a5e67e549bad598e23f3335b17f5956ade208cd520d48b0acef0283eb"}
Apr 21 04:01:02.541065 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:01:02.541028 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xb5d5" event={"ID":"3cad3688-3f99-4785-ae07-f31cc486dd1d","Type":"ContainerStarted","Data":"711527284d8f089991e787b4debfcb30b54b4f08973996e77a39d0855add5dcf"}
Apr 21 04:01:02.555099 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:01:02.555033 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xb5d5" podStartSLOduration=1.8015695699999998 podStartE2EDuration="5.555020061s" podCreationTimestamp="2026-04-21 04:00:57 +0000 UTC" firstStartedPulling="2026-04-21 04:00:57.74694905 +0000 UTC m=+214.617604978" lastFinishedPulling="2026-04-21 04:01:01.500399526 +0000 UTC m=+218.371055469" observedRunningTime="2026-04-21 04:01:02.553133426 +0000 UTC m=+219.423789376" watchObservedRunningTime="2026-04-21 04:01:02.555020061 +0000 UTC m=+219.425676012"
Apr 21 04:01:05.735353 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:01:05.735319 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:05.750151 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:01:05.750126 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:06.566436 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:01:06.566402 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:02:23.591534 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:02:23.591509 2580 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 04:04:17.045839 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.045805 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-hxhzt"]
Apr 21 04:04:17.048054 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.048035 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hxhzt"
Apr 21 04:04:17.050002 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.049965 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 04:04:17.050002 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.049997 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 21 04:04:17.050188 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.049997 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 04:04:17.050447 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.050430 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qsts\""
Apr 21 04:04:17.054310 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.054288 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hxhzt"]
Apr 21 04:04:17.119104 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.119052 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk96c\" (UniqueName: \"kubernetes.io/projected/d1e7cee8-08a0-478f-b275-84026323bd2f-kube-api-access-tk96c\") pod \"s3-init-hxhzt\" (UID: \"d1e7cee8-08a0-478f-b275-84026323bd2f\") " pod="kserve/s3-init-hxhzt"
Apr 21 04:04:17.219991 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.219952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk96c\" (UniqueName: \"kubernetes.io/projected/d1e7cee8-08a0-478f-b275-84026323bd2f-kube-api-access-tk96c\") pod \"s3-init-hxhzt\" (UID: \"d1e7cee8-08a0-478f-b275-84026323bd2f\") " pod="kserve/s3-init-hxhzt"
Apr 21 04:04:17.227827 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.227797 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk96c\" (UniqueName: \"kubernetes.io/projected/d1e7cee8-08a0-478f-b275-84026323bd2f-kube-api-access-tk96c\") pod \"s3-init-hxhzt\" (UID: \"d1e7cee8-08a0-478f-b275-84026323bd2f\") " pod="kserve/s3-init-hxhzt"
Apr 21 04:04:17.368574 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.368501 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hxhzt"
Apr 21 04:04:17.482476 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.482454 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hxhzt"]
Apr 21 04:04:17.485141 ip-10-0-128-88 kubenswrapper[2580]: W0421 04:04:17.485110 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e7cee8_08a0_478f_b275_84026323bd2f.slice/crio-e272aaa271a2cec3a3ca9f93d76e8fe3a7b8513083124201441c2ae051fc7889 WatchSource:0}: Error finding container e272aaa271a2cec3a3ca9f93d76e8fe3a7b8513083124201441c2ae051fc7889: Status 404 returned error can't find the container with id e272aaa271a2cec3a3ca9f93d76e8fe3a7b8513083124201441c2ae051fc7889
Apr 21 04:04:17.487211 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:17.487196 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:04:18.074043 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:18.073990 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hxhzt" event={"ID":"d1e7cee8-08a0-478f-b275-84026323bd2f","Type":"ContainerStarted","Data":"e272aaa271a2cec3a3ca9f93d76e8fe3a7b8513083124201441c2ae051fc7889"}
Apr 21 04:04:22.093162 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:22.093126 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hxhzt" event={"ID":"d1e7cee8-08a0-478f-b275-84026323bd2f","Type":"ContainerStarted","Data":"293bb7a88e3eaf3a0b9345ccd9a3fd6a08bb0f2993d0a2dc7af76f712b2c2025"}
Apr 21 04:04:22.106561 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:22.106518 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-hxhzt" podStartSLOduration=0.698212795 podStartE2EDuration="5.106505934s" podCreationTimestamp="2026-04-21 04:04:17 +0000 UTC" firstStartedPulling="2026-04-21 04:04:17.487322323 +0000 UTC m=+414.357978251" lastFinishedPulling="2026-04-21 04:04:21.895615462 +0000 UTC m=+418.766271390" observedRunningTime="2026-04-21 04:04:22.105396532 +0000 UTC m=+418.976052482" watchObservedRunningTime="2026-04-21 04:04:22.106505934 +0000 UTC m=+418.977161884"
Apr 21 04:04:25.104796 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:25.104743 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1e7cee8-08a0-478f-b275-84026323bd2f" containerID="293bb7a88e3eaf3a0b9345ccd9a3fd6a08bb0f2993d0a2dc7af76f712b2c2025" exitCode=0
Apr 21 04:04:25.105177 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:25.104815 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hxhzt" event={"ID":"d1e7cee8-08a0-478f-b275-84026323bd2f","Type":"ContainerDied","Data":"293bb7a88e3eaf3a0b9345ccd9a3fd6a08bb0f2993d0a2dc7af76f712b2c2025"}
Apr 21 04:04:26.228246 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:26.228225 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hxhzt"
Apr 21 04:04:26.282529 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:26.282501 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk96c\" (UniqueName: \"kubernetes.io/projected/d1e7cee8-08a0-478f-b275-84026323bd2f-kube-api-access-tk96c\") pod \"d1e7cee8-08a0-478f-b275-84026323bd2f\" (UID: \"d1e7cee8-08a0-478f-b275-84026323bd2f\") "
Apr 21 04:04:26.284485 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:26.284453 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e7cee8-08a0-478f-b275-84026323bd2f-kube-api-access-tk96c" (OuterVolumeSpecName: "kube-api-access-tk96c") pod "d1e7cee8-08a0-478f-b275-84026323bd2f" (UID: "d1e7cee8-08a0-478f-b275-84026323bd2f"). InnerVolumeSpecName "kube-api-access-tk96c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:04:26.383573 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:26.383495 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tk96c\" (UniqueName: \"kubernetes.io/projected/d1e7cee8-08a0-478f-b275-84026323bd2f-kube-api-access-tk96c\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\""
Apr 21 04:04:27.111307 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:27.111271 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hxhzt" event={"ID":"d1e7cee8-08a0-478f-b275-84026323bd2f","Type":"ContainerDied","Data":"e272aaa271a2cec3a3ca9f93d76e8fe3a7b8513083124201441c2ae051fc7889"}
Apr 21 04:04:27.111307 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:27.111308 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e272aaa271a2cec3a3ca9f93d76e8fe3a7b8513083124201441c2ae051fc7889"
Apr 21 04:04:27.111307 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:27.111280 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hxhzt"
Apr 21 04:04:34.586939 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.586903 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-m6ghx"]
Apr 21 04:04:34.587427 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.587218 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1e7cee8-08a0-478f-b275-84026323bd2f" containerName="s3-init"
Apr 21 04:04:34.587427 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.587230 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e7cee8-08a0-478f-b275-84026323bd2f" containerName="s3-init"
Apr 21 04:04:34.587427 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.587288 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1e7cee8-08a0-478f-b275-84026323bd2f" containerName="s3-init"
Apr 21 04:04:34.589226 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.589210 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:34.591218 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.591196 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 04:04:34.591344 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.591224 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 04:04:34.591344 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.591256 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qsts\""
Apr 21 04:04:34.591748 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.591733 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 21 04:04:34.594974 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.594951 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-m6ghx"]
Apr 21 04:04:34.647220 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.647185 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq9m4\" (UniqueName: \"kubernetes.io/projected/99b8381b-cc3a-4b62-bb24-d62b760ce5b0-kube-api-access-pq9m4\") pod \"s3-tls-init-custom-m6ghx\" (UID: \"99b8381b-cc3a-4b62-bb24-d62b760ce5b0\") " pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:34.748408 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.748371 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pq9m4\" (UniqueName: \"kubernetes.io/projected/99b8381b-cc3a-4b62-bb24-d62b760ce5b0-kube-api-access-pq9m4\") pod \"s3-tls-init-custom-m6ghx\" (UID: \"99b8381b-cc3a-4b62-bb24-d62b760ce5b0\") " pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:34.759161 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.759129 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq9m4\" (UniqueName: \"kubernetes.io/projected/99b8381b-cc3a-4b62-bb24-d62b760ce5b0-kube-api-access-pq9m4\") pod \"s3-tls-init-custom-m6ghx\" (UID: \"99b8381b-cc3a-4b62-bb24-d62b760ce5b0\") " pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:34.913941 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:34.913849 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:35.030360 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:35.030336 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-m6ghx"]
Apr 21 04:04:35.032798 ip-10-0-128-88 kubenswrapper[2580]: W0421 04:04:35.032766 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99b8381b_cc3a_4b62_bb24_d62b760ce5b0.slice/crio-2c5c6aad7435b09d94da90af59ec6bd98357e03a9c0edc8809ac71d11e19a3a3 WatchSource:0}: Error finding container 2c5c6aad7435b09d94da90af59ec6bd98357e03a9c0edc8809ac71d11e19a3a3: Status 404 returned error can't find the container with id 2c5c6aad7435b09d94da90af59ec6bd98357e03a9c0edc8809ac71d11e19a3a3
Apr 21 04:04:35.135562 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:35.135531 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-m6ghx" event={"ID":"99b8381b-cc3a-4b62-bb24-d62b760ce5b0","Type":"ContainerStarted","Data":"5807caa1f0302eea0eedaa1e980d74336dfddfd745de5e0b0d8601f543c28aa9"}
Apr 21 04:04:35.135562 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:35.135565 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-m6ghx" event={"ID":"99b8381b-cc3a-4b62-bb24-d62b760ce5b0","Type":"ContainerStarted","Data":"2c5c6aad7435b09d94da90af59ec6bd98357e03a9c0edc8809ac71d11e19a3a3"}
Apr 21 04:04:35.149847 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:35.149804 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-m6ghx" podStartSLOduration=1.149791009 podStartE2EDuration="1.149791009s" podCreationTimestamp="2026-04-21 04:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:04:35.149101957 +0000 UTC m=+432.019757905" watchObservedRunningTime="2026-04-21 04:04:35.149791009 +0000 UTC m=+432.020446958"
Apr 21 04:04:41.152724 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:41.152688 2580 generic.go:358] "Generic (PLEG): container finished" podID="99b8381b-cc3a-4b62-bb24-d62b760ce5b0" containerID="5807caa1f0302eea0eedaa1e980d74336dfddfd745de5e0b0d8601f543c28aa9" exitCode=0
Apr 21 04:04:41.153102 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:41.152742 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-m6ghx" event={"ID":"99b8381b-cc3a-4b62-bb24-d62b760ce5b0","Type":"ContainerDied","Data":"5807caa1f0302eea0eedaa1e980d74336dfddfd745de5e0b0d8601f543c28aa9"}
Apr 21 04:04:42.279449 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:42.279426 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:42.308922 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:42.308894 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq9m4\" (UniqueName: \"kubernetes.io/projected/99b8381b-cc3a-4b62-bb24-d62b760ce5b0-kube-api-access-pq9m4\") pod \"99b8381b-cc3a-4b62-bb24-d62b760ce5b0\" (UID: \"99b8381b-cc3a-4b62-bb24-d62b760ce5b0\") "
Apr 21 04:04:42.310989 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:42.310966 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b8381b-cc3a-4b62-bb24-d62b760ce5b0-kube-api-access-pq9m4" (OuterVolumeSpecName: "kube-api-access-pq9m4") pod "99b8381b-cc3a-4b62-bb24-d62b760ce5b0" (UID: "99b8381b-cc3a-4b62-bb24-d62b760ce5b0"). InnerVolumeSpecName "kube-api-access-pq9m4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:04:42.409899 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:42.409825 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pq9m4\" (UniqueName: \"kubernetes.io/projected/99b8381b-cc3a-4b62-bb24-d62b760ce5b0-kube-api-access-pq9m4\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\""
Apr 21 04:04:43.159097 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:43.159058 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-m6ghx"
Apr 21 04:04:43.159256 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:43.159057 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-m6ghx" event={"ID":"99b8381b-cc3a-4b62-bb24-d62b760ce5b0","Type":"ContainerDied","Data":"2c5c6aad7435b09d94da90af59ec6bd98357e03a9c0edc8809ac71d11e19a3a3"}
Apr 21 04:04:43.159256 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:43.159180 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c5c6aad7435b09d94da90af59ec6bd98357e03a9c0edc8809ac71d11e19a3a3"
Apr 21 04:04:45.847936 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.847896 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-stjks"]
Apr 21 04:04:45.848414 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.848385 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99b8381b-cc3a-4b62-bb24-d62b760ce5b0" containerName="s3-tls-init-custom"
Apr 21 04:04:45.848414 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.848404 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b8381b-cc3a-4b62-bb24-d62b760ce5b0" containerName="s3-tls-init-custom"
Apr 21 04:04:45.848527 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.848492 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="99b8381b-cc3a-4b62-bb24-d62b760ce5b0" containerName="s3-tls-init-custom"
Apr 21 04:04:45.851549 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.851530 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-stjks"
Apr 21 04:04:45.853349 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.853323 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 04:04:45.853458 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.853323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4qsts\""
Apr 21 04:04:45.853458 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.853384 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 21 04:04:45.853649 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.853635 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 04:04:45.855859 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.855840 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-stjks"]
Apr 21 04:04:45.937497 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:45.937456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9s7s\" (UniqueName: \"kubernetes.io/projected/abb42870-422c-4a56-81c9-f3064514ebb8-kube-api-access-k9s7s\") pod \"s3-tls-init-serving-stjks\" (UID: \"abb42870-422c-4a56-81c9-f3064514ebb8\") " pod="kserve/s3-tls-init-serving-stjks"
Apr 21 04:04:46.038424 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:46.038372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9s7s\" (UniqueName: \"kubernetes.io/projected/abb42870-422c-4a56-81c9-f3064514ebb8-kube-api-access-k9s7s\") pod \"s3-tls-init-serving-stjks\" (UID: \"abb42870-422c-4a56-81c9-f3064514ebb8\") " pod="kserve/s3-tls-init-serving-stjks"
Apr 21 04:04:46.045698 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:46.045669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9s7s\" (UniqueName: \"kubernetes.io/projected/abb42870-422c-4a56-81c9-f3064514ebb8-kube-api-access-k9s7s\") pod \"s3-tls-init-serving-stjks\" (UID: \"abb42870-422c-4a56-81c9-f3064514ebb8\") " pod="kserve/s3-tls-init-serving-stjks"
Apr 21 04:04:46.171149 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:46.171058 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-stjks"
Apr 21 04:04:46.283475 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:46.283449 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-stjks"]
Apr 21 04:04:46.285859 ip-10-0-128-88 kubenswrapper[2580]: W0421 04:04:46.285834 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabb42870_422c_4a56_81c9_f3064514ebb8.slice/crio-da60fd2c6f29cb4caace3403a3a12301cfb4aeebfe55c5d917cfe11c580a219a WatchSource:0}: Error finding container da60fd2c6f29cb4caace3403a3a12301cfb4aeebfe55c5d917cfe11c580a219a: Status 404 returned error can't find the container with id da60fd2c6f29cb4caace3403a3a12301cfb4aeebfe55c5d917cfe11c580a219a
Apr 21 04:04:47.172370 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:47.172335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-stjks" event={"ID":"abb42870-422c-4a56-81c9-f3064514ebb8","Type":"ContainerStarted","Data":"d1ef7994b42d69870cecc13b1c1af76ee44510f1ccbe2878177686687b6cd40f"}
Apr 21 04:04:47.172370 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:47.172372 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-stjks" event={"ID":"abb42870-422c-4a56-81c9-f3064514ebb8","Type":"ContainerStarted","Data":"da60fd2c6f29cb4caace3403a3a12301cfb4aeebfe55c5d917cfe11c580a219a"}
Apr 21 04:04:47.184929 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:47.184879 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-stjks" podStartSLOduration=2.184862851 podStartE2EDuration="2.184862851s" podCreationTimestamp="2026-04-21 04:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:04:47.183868166 +0000 UTC m=+444.054524117" watchObservedRunningTime="2026-04-21 04:04:47.184862851 +0000 UTC m=+444.055518801"
Apr 21 04:04:52.188135 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:52.188098 2580 generic.go:358] "Generic (PLEG): container finished" podID="abb42870-422c-4a56-81c9-f3064514ebb8" containerID="d1ef7994b42d69870cecc13b1c1af76ee44510f1ccbe2878177686687b6cd40f" exitCode=0
Apr 21 04:04:52.188540 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:52.188107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-stjks" event={"ID":"abb42870-422c-4a56-81c9-f3064514ebb8","Type":"ContainerDied","Data":"d1ef7994b42d69870cecc13b1c1af76ee44510f1ccbe2878177686687b6cd40f"}
Apr 21 04:04:53.318612 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:53.318592 2580 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve/s3-tls-init-serving-stjks" Apr 21 04:04:53.395729 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:53.395702 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9s7s\" (UniqueName: \"kubernetes.io/projected/abb42870-422c-4a56-81c9-f3064514ebb8-kube-api-access-k9s7s\") pod \"abb42870-422c-4a56-81c9-f3064514ebb8\" (UID: \"abb42870-422c-4a56-81c9-f3064514ebb8\") " Apr 21 04:04:53.397635 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:53.397607 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb42870-422c-4a56-81c9-f3064514ebb8-kube-api-access-k9s7s" (OuterVolumeSpecName: "kube-api-access-k9s7s") pod "abb42870-422c-4a56-81c9-f3064514ebb8" (UID: "abb42870-422c-4a56-81c9-f3064514ebb8"). InnerVolumeSpecName "kube-api-access-k9s7s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:04:53.496341 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:53.496302 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9s7s\" (UniqueName: \"kubernetes.io/projected/abb42870-422c-4a56-81c9-f3064514ebb8-kube-api-access-k9s7s\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 04:04:54.195253 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:54.195218 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-stjks" event={"ID":"abb42870-422c-4a56-81c9-f3064514ebb8","Type":"ContainerDied","Data":"da60fd2c6f29cb4caace3403a3a12301cfb4aeebfe55c5d917cfe11c580a219a"} Apr 21 04:04:54.195253 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:54.195250 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-stjks" Apr 21 04:04:54.195473 ip-10-0-128-88 kubenswrapper[2580]: I0421 04:04:54.195253 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da60fd2c6f29cb4caace3403a3a12301cfb4aeebfe55c5d917cfe11c580a219a" Apr 21 05:00:07.536917 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.536878 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kld45/must-gather-67fcs"] Apr 21 05:00:07.537356 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.537214 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb42870-422c-4a56-81c9-f3064514ebb8" containerName="s3-tls-init-serving" Apr 21 05:00:07.537356 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.537226 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb42870-422c-4a56-81c9-f3064514ebb8" containerName="s3-tls-init-serving" Apr 21 05:00:07.537356 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.537277 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="abb42870-422c-4a56-81c9-f3064514ebb8" containerName="s3-tls-init-serving" Apr 21 05:00:07.540178 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.540161 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.542148 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.542124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kld45\"/\"openshift-service-ca.crt\"" Apr 21 05:00:07.542271 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.542127 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kld45\"/\"kube-root-ca.crt\"" Apr 21 05:00:07.547647 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.547628 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kld45/must-gather-67fcs"] Apr 21 05:00:07.628689 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.628651 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420244a3-2042-457d-9072-0e4cbc0e7833-must-gather-output\") pod \"must-gather-67fcs\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.628848 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.628718 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqsnh\" (UniqueName: \"kubernetes.io/projected/420244a3-2042-457d-9072-0e4cbc0e7833-kube-api-access-cqsnh\") pod \"must-gather-67fcs\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.729816 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.729777 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqsnh\" (UniqueName: \"kubernetes.io/projected/420244a3-2042-457d-9072-0e4cbc0e7833-kube-api-access-cqsnh\") pod \"must-gather-67fcs\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 
05:00:07.729980 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.729826 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420244a3-2042-457d-9072-0e4cbc0e7833-must-gather-output\") pod \"must-gather-67fcs\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.730158 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.730138 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420244a3-2042-457d-9072-0e4cbc0e7833-must-gather-output\") pod \"must-gather-67fcs\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.738510 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.738487 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqsnh\" (UniqueName: \"kubernetes.io/projected/420244a3-2042-457d-9072-0e4cbc0e7833-kube-api-access-cqsnh\") pod \"must-gather-67fcs\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.857784 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.857699 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:07.973594 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.973569 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kld45/must-gather-67fcs"] Apr 21 05:00:07.977762 ip-10-0-128-88 kubenswrapper[2580]: W0421 05:00:07.977730 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod420244a3_2042_457d_9072_0e4cbc0e7833.slice/crio-ae2c917977308d5afed34dd85c6a4da24d66d60a918317dc2cdec76da1fab38b WatchSource:0}: Error finding container ae2c917977308d5afed34dd85c6a4da24d66d60a918317dc2cdec76da1fab38b: Status 404 returned error can't find the container with id ae2c917977308d5afed34dd85c6a4da24d66d60a918317dc2cdec76da1fab38b Apr 21 05:00:07.979258 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:07.979241 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 05:00:08.574743 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:08.574658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kld45/must-gather-67fcs" event={"ID":"420244a3-2042-457d-9072-0e4cbc0e7833","Type":"ContainerStarted","Data":"ae2c917977308d5afed34dd85c6a4da24d66d60a918317dc2cdec76da1fab38b"} Apr 21 05:00:12.589276 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:12.589234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kld45/must-gather-67fcs" event={"ID":"420244a3-2042-457d-9072-0e4cbc0e7833","Type":"ContainerStarted","Data":"e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528"} Apr 21 05:00:12.589757 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:12.589283 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kld45/must-gather-67fcs" 
event={"ID":"420244a3-2042-457d-9072-0e4cbc0e7833","Type":"ContainerStarted","Data":"2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01"} Apr 21 05:00:12.602950 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:12.602894 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kld45/must-gather-67fcs" podStartSLOduration=1.484543795 podStartE2EDuration="5.602879434s" podCreationTimestamp="2026-04-21 05:00:07 +0000 UTC" firstStartedPulling="2026-04-21 05:00:07.979379281 +0000 UTC m=+3764.850035209" lastFinishedPulling="2026-04-21 05:00:12.09771492 +0000 UTC m=+3768.968370848" observedRunningTime="2026-04-21 05:00:12.602149506 +0000 UTC m=+3769.472805456" watchObservedRunningTime="2026-04-21 05:00:12.602879434 +0000 UTC m=+3769.473535384" Apr 21 05:00:33.656308 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:33.656270 2580 generic.go:358] "Generic (PLEG): container finished" podID="420244a3-2042-457d-9072-0e4cbc0e7833" containerID="2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01" exitCode=0 Apr 21 05:00:33.656706 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:33.656345 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kld45/must-gather-67fcs" event={"ID":"420244a3-2042-457d-9072-0e4cbc0e7833","Type":"ContainerDied","Data":"2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01"} Apr 21 05:00:33.656706 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:33.656639 2580 scope.go:117] "RemoveContainer" containerID="2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01" Apr 21 05:00:34.155481 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:34.155451 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kld45_must-gather-67fcs_420244a3-2042-457d-9072-0e4cbc0e7833/gather/0.log" Apr 21 05:00:37.779792 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:37.779759 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-xb5d5_3cad3688-3f99-4785-ae07-f31cc486dd1d/global-pull-secret-syncer/0.log" Apr 21 05:00:37.872980 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:37.872938 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8dh6m_753c9f07-568c-4fcd-a6b8-25bada9bac1b/konnectivity-agent/0.log" Apr 21 05:00:37.931472 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:37.931448 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-88.ec2.internal_6fbea50e95cbda5ca95918851e7c31a8/haproxy/0.log" Apr 21 05:00:39.677205 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:39.677168 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kld45/must-gather-67fcs"] Apr 21 05:00:39.677719 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:39.677392 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kld45/must-gather-67fcs" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="copy" containerID="cri-o://e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528" gracePeriod=2 Apr 21 05:00:39.682684 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:39.682656 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kld45/must-gather-67fcs"] Apr 21 05:00:39.902785 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:39.902763 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kld45_must-gather-67fcs_420244a3-2042-457d-9072-0e4cbc0e7833/copy/0.log" Apr 21 05:00:39.903107 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:39.903092 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:40.000757 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.000729 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420244a3-2042-457d-9072-0e4cbc0e7833-must-gather-output\") pod \"420244a3-2042-457d-9072-0e4cbc0e7833\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " Apr 21 05:00:40.000911 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.000774 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqsnh\" (UniqueName: \"kubernetes.io/projected/420244a3-2042-457d-9072-0e4cbc0e7833-kube-api-access-cqsnh\") pod \"420244a3-2042-457d-9072-0e4cbc0e7833\" (UID: \"420244a3-2042-457d-9072-0e4cbc0e7833\") " Apr 21 05:00:40.002165 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.002138 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/420244a3-2042-457d-9072-0e4cbc0e7833-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "420244a3-2042-457d-9072-0e4cbc0e7833" (UID: "420244a3-2042-457d-9072-0e4cbc0e7833"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 05:00:40.002870 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.002847 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420244a3-2042-457d-9072-0e4cbc0e7833-kube-api-access-cqsnh" (OuterVolumeSpecName: "kube-api-access-cqsnh") pod "420244a3-2042-457d-9072-0e4cbc0e7833" (UID: "420244a3-2042-457d-9072-0e4cbc0e7833"). InnerVolumeSpecName "kube-api-access-cqsnh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 05:00:40.101510 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.101480 2580 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/420244a3-2042-457d-9072-0e4cbc0e7833-must-gather-output\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 05:00:40.101510 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.101509 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqsnh\" (UniqueName: \"kubernetes.io/projected/420244a3-2042-457d-9072-0e4cbc0e7833-kube-api-access-cqsnh\") on node \"ip-10-0-128-88.ec2.internal\" DevicePath \"\"" Apr 21 05:00:40.678422 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.678392 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kld45_must-gather-67fcs_420244a3-2042-457d-9072-0e4cbc0e7833/copy/0.log" Apr 21 05:00:40.678829 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.678701 2580 generic.go:358] "Generic (PLEG): container finished" podID="420244a3-2042-457d-9072-0e4cbc0e7833" containerID="e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528" exitCode=143 Apr 21 05:00:40.678829 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.678746 2580 scope.go:117] "RemoveContainer" containerID="e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528" Apr 21 05:00:40.678829 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.678752 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kld45/must-gather-67fcs" Apr 21 05:00:40.687268 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.687137 2580 scope.go:117] "RemoveContainer" containerID="2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01" Apr 21 05:00:40.698497 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.698480 2580 scope.go:117] "RemoveContainer" containerID="e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528" Apr 21 05:00:40.698741 ip-10-0-128-88 kubenswrapper[2580]: E0421 05:00:40.698720 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528\": container with ID starting with e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528 not found: ID does not exist" containerID="e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528" Apr 21 05:00:40.698786 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.698750 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528"} err="failed to get container status \"e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528\": rpc error: code = NotFound desc = could not find container \"e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528\": container with ID starting with e57bbd5570b02d8c7cc40eb39e897a5dd001959e9fcfd0b439ba8e44a5754528 not found: ID does not exist" Apr 21 05:00:40.698786 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.698768 2580 scope.go:117] "RemoveContainer" containerID="2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01" Apr 21 05:00:40.699008 ip-10-0-128-88 kubenswrapper[2580]: E0421 05:00:40.698990 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01\": container with ID starting with 2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01 not found: ID does not exist" containerID="2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01" Apr 21 05:00:40.699049 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:40.699013 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01"} err="failed to get container status \"2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01\": rpc error: code = NotFound desc = could not find container \"2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01\": container with ID starting with 2da5fcbf7bcd7105b3aab9ab53bcb39c51de7ca28f0833ca207da15e6fe73b01 not found: ID does not exist" Apr 21 05:00:41.474144 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.474118 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jcf7w_0fef8221-7674-4fa6-849b-be7e4e2ffda0/kube-state-metrics/0.log" Apr 21 05:00:41.496794 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.496772 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jcf7w_0fef8221-7674-4fa6-849b-be7e4e2ffda0/kube-rbac-proxy-main/0.log" Apr 21 05:00:41.516139 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.516120 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jcf7w_0fef8221-7674-4fa6-849b-be7e4e2ffda0/kube-rbac-proxy-self/0.log" Apr 21 05:00:41.544804 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.544765 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6945946c54-md5kz_074fb1e8-3fa3-4376-9a76-b629900a63b2/metrics-server/0.log" Apr 21 05:00:41.571073 ip-10-0-128-88 
kubenswrapper[2580]: I0421 05:00:41.571052 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xzsgw_d8ef523e-3eec-4098-936e-2664b106f3c3/monitoring-plugin/0.log" Apr 21 05:00:41.705142 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.705111 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" path="/var/lib/kubelet/pods/420244a3-2042-457d-9072-0e4cbc0e7833/volumes" Apr 21 05:00:41.757630 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.757560 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zx8jh_80dc2c7a-c7d6-4faf-91e9-83b408f0ea18/node-exporter/0.log" Apr 21 05:00:41.778287 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.778259 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zx8jh_80dc2c7a-c7d6-4faf-91e9-83b408f0ea18/kube-rbac-proxy/0.log" Apr 21 05:00:41.799599 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.799577 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zx8jh_80dc2c7a-c7d6-4faf-91e9-83b408f0ea18/init-textfile/0.log" Apr 21 05:00:41.823884 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.823858 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2chdw_d97ba0ce-69e2-41b1-a18a-edc1ade7702c/kube-rbac-proxy-main/0.log" Apr 21 05:00:41.843329 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.843299 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2chdw_d97ba0ce-69e2-41b1-a18a-edc1ade7702c/kube-rbac-proxy-self/0.log" Apr 21 05:00:41.864339 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.864316 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2chdw_d97ba0ce-69e2-41b1-a18a-edc1ade7702c/openshift-state-metrics/0.log" Apr 21 05:00:41.916753 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.916720 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/prometheus/0.log" Apr 21 05:00:41.936071 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.936042 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/config-reloader/0.log" Apr 21 05:00:41.958733 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.958714 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/thanos-sidecar/0.log" Apr 21 05:00:41.978657 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.978636 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/kube-rbac-proxy-web/0.log" Apr 21 05:00:41.999655 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:41.999618 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/kube-rbac-proxy/0.log" Apr 21 05:00:42.019411 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:42.019352 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/kube-rbac-proxy-thanos/0.log" Apr 21 05:00:42.044327 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:42.044306 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_05126879-231f-4438-a109-69892c129bfd/init-config-reloader/0.log" Apr 21 05:00:44.299969 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.299929 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-6bcc868b7-hffk9_fa8f0c1f-765d-4933-a93e-b01b48713a92/download-server/0.log" Apr 21 05:00:44.883075 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883039 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"] Apr 21 05:00:44.883450 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883434 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="gather" Apr 21 05:00:44.883503 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883452 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="gather" Apr 21 05:00:44.883503 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883470 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="copy" Apr 21 05:00:44.883503 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883475 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="copy" Apr 21 05:00:44.883607 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883524 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="gather" Apr 21 05:00:44.883607 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.883531 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="420244a3-2042-457d-9072-0e4cbc0e7833" containerName="copy" Apr 21 05:00:44.888834 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.888813 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:44.890921 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.890896 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqvxw\"/\"openshift-service-ca.crt\""
Apr 21 05:00:44.890921 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.890913 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqvxw\"/\"kube-root-ca.crt\""
Apr 21 05:00:44.891338 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.891306 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vqvxw\"/\"default-dockercfg-vstpr\""
Apr 21 05:00:44.893194 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.893174 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"]
Apr 21 05:00:44.939394 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.939353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-lib-modules\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:44.939564 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.939405 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-proc\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:44.939564 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.939495 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgq48\" (UniqueName: \"kubernetes.io/projected/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-kube-api-access-pgq48\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:44.939564 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.939541 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-sys\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:44.939564 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:44.939562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-podres\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040199 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040143 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-sys\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040199 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040204 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-podres\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040241 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-lib-modules\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040266 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-sys\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040279 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-proc\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-proc\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040368 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgq48\" (UniqueName: \"kubernetes.io/projected/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-kube-api-access-pgq48\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040378 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-podres\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.040437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.040391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-lib-modules\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.047995 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.047964 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgq48\" (UniqueName: \"kubernetes.io/projected/f8bed6b5-a366-49c0-a3f6-3ca4a4180982-kube-api-access-pgq48\") pod \"perf-node-gather-daemonset-p4sjx\" (UID: \"f8bed6b5-a366-49c0-a3f6-3ca4a4180982\") " pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.199697 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.199605 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.315022 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.314996 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"]
Apr 21 05:00:45.317548 ip-10-0-128-88 kubenswrapper[2580]: W0421 05:00:45.317512 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf8bed6b5_a366_49c0_a3f6_3ca4a4180982.slice/crio-ea5dce9f89637db3c879a6b96f4a485e606cadd88537b8196500bdc9c7aaa472 WatchSource:0}: Error finding container ea5dce9f89637db3c879a6b96f4a485e606cadd88537b8196500bdc9c7aaa472: Status 404 returned error can't find the container with id ea5dce9f89637db3c879a6b96f4a485e606cadd88537b8196500bdc9c7aaa472
Apr 21 05:00:45.395389 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.395362 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-29n5x_772f98a5-784d-4ae7-9617-a1b4f77424fb/dns/0.log"
Apr 21 05:00:45.418093 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.418053 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-29n5x_772f98a5-784d-4ae7-9617-a1b4f77424fb/kube-rbac-proxy/0.log"
Apr 21 05:00:45.574279 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.574249 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vjbp2_8db5c95c-dcdd-437d-bbd8-b52b4146dc61/dns-node-resolver/0.log"
Apr 21 05:00:45.694513 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.694477 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx" event={"ID":"f8bed6b5-a366-49c0-a3f6-3ca4a4180982","Type":"ContainerStarted","Data":"afeb0437767050fe429660831368062b76e6eb255cc6af8d3fe2e660ec813e31"}
Apr 21 05:00:45.694513 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.694514 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx" event={"ID":"f8bed6b5-a366-49c0-a3f6-3ca4a4180982","Type":"ContainerStarted","Data":"ea5dce9f89637db3c879a6b96f4a485e606cadd88537b8196500bdc9c7aaa472"}
Apr 21 05:00:45.694730 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.694615 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:45.710616 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:45.710572 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx" podStartSLOduration=1.7105555190000001 podStartE2EDuration="1.710555519s" podCreationTimestamp="2026-04-21 05:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 05:00:45.709012218 +0000 UTC m=+3802.579668168" watchObservedRunningTime="2026-04-21 05:00:45.710555519 +0000 UTC m=+3802.581211677"
Apr 21 05:00:46.041701 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:46.041666 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lrfz2_aaab1022-57cd-4e71-8136-36d25cbe7fa1/node-ca/0.log"
Apr 21 05:00:46.757589 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:46.757556 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-57d784794-br7xh_f4c8739a-c60c-42cd-bc9f-8648b4999008/router/0.log"
Apr 21 05:00:47.108970 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:47.108888 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tj58v_c3592d56-3229-4a9f-8d19-2b45ed61d4c0/serve-healthcheck-canary/0.log"
Apr 21 05:00:47.574740 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:47.574708 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jbmcx_2cb64835-0101-4b02-85c8-73487693689f/kube-rbac-proxy/0.log"
Apr 21 05:00:47.593863 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:47.593829 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jbmcx_2cb64835-0101-4b02-85c8-73487693689f/exporter/0.log"
Apr 21 05:00:47.615230 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:47.615199 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jbmcx_2cb64835-0101-4b02-85c8-73487693689f/extractor/0.log"
Apr 21 05:00:49.912313 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:49.912275 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-hxhzt_d1e7cee8-08a0-478f-b275-84026323bd2f/s3-init/0.log"
Apr 21 05:00:49.934355 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:49.934330 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-m6ghx_99b8381b-cc3a-4b62-bb24-d62b760ce5b0/s3-tls-init-custom/0.log"
Apr 21 05:00:49.954756 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:49.954729 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-stjks_abb42870-422c-4a56-81c9-f3064514ebb8/s3-tls-init-serving/0.log"
Apr 21 05:00:51.707967 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:51.707943 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vqvxw/perf-node-gather-daemonset-p4sjx"
Apr 21 05:00:53.807924 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:53.807897 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zkx9f_22bf5a92-f511-48ff-86be-263716a64584/migrator/0.log"
Apr 21 05:00:53.827753 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:53.827727 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zkx9f_22bf5a92-f511-48ff-86be-263716a64584/graceful-termination/0.log"
Apr 21 05:00:55.109254 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.109225 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-29b2g_8fd542f8-4ff1-46b3-821d-17015eac9ffa/kube-multus/0.log"
Apr 21 05:00:55.286673 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.286645 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/kube-multus-additional-cni-plugins/0.log"
Apr 21 05:00:55.307278 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.307257 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/egress-router-binary-copy/0.log"
Apr 21 05:00:55.326123 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.326099 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/cni-plugins/0.log"
Apr 21 05:00:55.345437 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.345417 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/bond-cni-plugin/0.log"
Apr 21 05:00:55.364149 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.364069 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/routeoverride-cni/0.log"
Apr 21 05:00:55.382721 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.382701 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/whereabouts-cni-bincopy/0.log"
Apr 21 05:00:55.403012 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.402991 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjl4c_798393e0-1967-4ff8-bdbd-5debf844db1d/whereabouts-cni/0.log"
Apr 21 05:00:55.751460 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.751429 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x476t_8746933a-dcd1-407c-8ebf-6ce3af9d58c0/network-metrics-daemon/0.log"
Apr 21 05:00:55.769360 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:55.769335 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x476t_8746933a-dcd1-407c-8ebf-6ce3af9d58c0/kube-rbac-proxy/0.log"
Apr 21 05:00:56.780164 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.780133 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/ovn-controller/0.log"
Apr 21 05:00:56.829290 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.829245 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/ovn-acl-logging/0.log"
Apr 21 05:00:56.847619 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.847581 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/kube-rbac-proxy-node/0.log"
Apr 21 05:00:56.866096 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.866054 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 05:00:56.884118 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.884059 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/northd/0.log"
Apr 21 05:00:56.902923 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.902901 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/nbdb/0.log"
Apr 21 05:00:56.921808 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:56.921786 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/sbdb/0.log"
Apr 21 05:00:57.086858 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:57.086767 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gl9hq_3a9c3508-2c05-4c97-851c-899383bc9ca7/ovnkube-controller/0.log"
Apr 21 05:00:58.348652 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:58.348629 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-f65gr_89ffc5c7-9bcc-4d22-ad71-079b7d40a5d1/network-check-target-container/0.log"
Apr 21 05:00:59.258295 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:59.258264 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-m8ps6_3311248f-8a11-446e-8162-2933f3299d2d/iptables-alerter/0.log"
Apr 21 05:00:59.852864 ip-10-0-128-88 kubenswrapper[2580]: I0421 05:00:59.852832 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bgswk_c7de1195-0825-476e-b0d2-fdf06e76e365/tuned/0.log"