Apr 22 15:08:15.781669 ip-10-0-141-188 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:08:16.164653 ip-10-0-141-188 kubenswrapper[2606]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:16.164653 ip-10-0-141-188 kubenswrapper[2606]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:08:16.164653 ip-10-0-141-188 kubenswrapper[2606]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:16.164653 ip-10-0-141-188 kubenswrapper[2606]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:08:16.164653 ip-10-0-141-188 kubenswrapper[2606]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
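The deprecation warnings above all point at the same remedy: move these settings into the config file named by the kubelet's --config flag. As a sketch only (field names are from the upstream KubeletConfiguration v1beta1 API; the values below are illustrative placeholders, not taken from this node), the flagged flags map to config-file fields roughly like this:

```yaml
# Hypothetical fragment of the file passed via --config; values are examples.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"  # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir
systemReserved:               # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:                 # --minimum-container-ttl-duration is superseded by eviction settings
  memory.available: "100Mi"
```

On a managed cluster like this one, the config file is typically rendered by the platform rather than edited by hand, so these warnings are usually actionable for the component that generates the kubelet invocation, not the node administrator.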
Apr 22 15:08:16.167815 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.167724    2606 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:08:16.171344 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171327    2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.171344 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171344    2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171348    2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171351    2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171354    2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171370    2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171373    2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171377    2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171380    2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171383    2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171386    2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171389    2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171392    2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171400    2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171403    2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171406    2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171411    2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171415    2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171418    2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171421    2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171424    2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.171423 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171427    2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171430    2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171433    2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171436    2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171440    2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171444    2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171447    2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171450    2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171452    2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171456    2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171459    2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171461    2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171464    2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171466    2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171469    2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171471    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171475    2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171477    2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171480    2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.171901 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171482    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171485    2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171487    2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171490    2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171492    2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171495    2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171497    2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171500    2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171502    2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171504    2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171507    2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171510    2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171512    2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171514    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171517    2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171521    2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171524    2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171527    2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171529    2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.172663 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171532    2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171535    2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171537    2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171540    2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171542    2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171544    2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171547    2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171550    2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171553    2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171555    2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171559    2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171561    2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171564    2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171567    2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171569    2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171572    2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171574    2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171577    2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171579    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.173223 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171582    2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171585    2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171588    2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171590    2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171593    2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171595    2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171598    2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.171601    2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173351    2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173381    2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173387    2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173390    2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173393    2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173397    2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173399    2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173402    2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173405    2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173407    2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173411    2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173413    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.173706 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173416    2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173419    2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173421    2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173424    2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173427    2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173430    2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173434    2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173438    2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173441    2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173444    2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173448    2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173451    2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173454    2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173456    2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173459    2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173462    2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173465    2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173469    2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
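The long runs of "unrecognized feature gate" warnings suggest the gate list reaches the kubelet through the featureGates map of its configuration (on OpenShift this file is rendered by the platform's machine-config machinery — an assumption, not shown in this log). As this log itself demonstrates, gates the upstream kubelet does not know, such as the OpenShift-only names above, are logged at warning level and skipped rather than failing startup, while known-but-deprecated gates like KMSv1 get their own warning. A hypothetical fragment of such a config:

```yaml
# Hypothetical featureGates section of a KubeletConfiguration; gate names are
# taken from the warnings above, the boolean values are illustrative only.
featureGates:
  MachineConfigNodes: true    # OpenShift-only: kubelet logs "unrecognized" and continues
  RouteAdvertisements: false  # OpenShift-only: same treatment
  KMSv1: true                 # known upstream but deprecated: kubelet warns it will be removed
```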
Apr 22 15:08:16.174194 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173472    2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173475    2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173478    2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173481    2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173484    2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173487    2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173489    2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173492    2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173494    2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173497    2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173500    2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173504    2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173506    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173509    2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173512    2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173514    2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173517    2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173519    2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173522    2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173525    2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.174651 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173527    2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173530    2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173532    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173535    2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173537    2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173540    2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173543    2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173545    2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173548    2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173550    2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173553    2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173556    2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173558    2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173561    2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173563    2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173568    2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173570    2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173573    2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173576    2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173580    2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.175146 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173582    2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173585    2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173588    2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173590    2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173593    2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173596    2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173598    2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173601    2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173603    2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173606    2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173609    2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173611    2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173613    2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173616    2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173619    2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.173622    2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174437    2606 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174446    2606 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174454    2606 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174458    2606 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174463    2606 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174466    2606 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:08:16.175631 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174471    2606 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174475    2606 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174479    2606 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174482    2606 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174486    2606 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174489    2606 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174493    2606 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174496    2606 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174499    2606 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174502    2606 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174505    2606 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174507    2606 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174510    2606 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174515    2606 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174518    2606 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174522    2606 flags.go:64] FLAG: --config-dir=""
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174525    2606 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174528    2606 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174538    2606 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174542    2606 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174545    2606 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174548    2606 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174551    2606 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174554    2606 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:08:16.176174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174557    2606 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174561    2606 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174563    2606 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174568    2606 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174571    2606 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174574    2606 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174577    2606 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174580    2606 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174583    2606 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174588    2606 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174591    2606 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174594    2606 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174597    2606 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174601    2606 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174605    2606 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174608    2606 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174611    2606 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174614    2606 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174617    2606 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174620    2606 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174623    2606 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:08:16.176866 ip-10-0-141-188
kubenswrapper[2606]: I0422 15:08:16.174626 2606 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174629 2606 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174632 2606 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174634 2606 flags.go:64] FLAG: --feature-gates="" Apr 22 15:08:16.176866 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174638 2606 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174641 2606 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174645 2606 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174648 2606 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174651 2606 flags.go:64] FLAG: --healthz-port="10248" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174655 2606 flags.go:64] FLAG: --help="false" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174658 2606 flags.go:64] FLAG: --hostname-override="ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174661 2606 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174664 2606 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174667 2606 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:08:16.174671 2606 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174674 2606 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174677 2606 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174680 2606 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174683 2606 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174686 2606 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174689 2606 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174692 2606 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174695 2606 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174698 2606 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174701 2606 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174705 2606 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174708 2606 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174711 2606 flags.go:64] FLAG: --lock-file="" Apr 22 15:08:16.177480 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174714 
2606 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174717 2606 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174720 2606 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174725 2606 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174728 2606 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174731 2606 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174734 2606 flags.go:64] FLAG: --logging-format="text" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174737 2606 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174741 2606 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174744 2606 flags.go:64] FLAG: --manifest-url="" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174747 2606 flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174752 2606 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174755 2606 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174760 2606 flags.go:64] FLAG: --max-pods="110" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174763 2606 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:08:16.178101 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174766 2606 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174769 2606 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174772 2606 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174775 2606 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174778 2606 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174781 2606 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174790 2606 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174793 2606 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174796 2606 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:08:16.178101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174800 2606 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174802 2606 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174809 2606 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174812 2606 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174816 2606 flags.go:64] 
FLAG: --pods-per-core="0" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174819 2606 flags.go:64] FLAG: --port="10250" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174822 2606 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174825 2606 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c4d57f0918ab4f6b" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174828 2606 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174831 2606 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174833 2606 flags.go:64] FLAG: --register-node="true" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174836 2606 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174839 2606 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174843 2606 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174846 2606 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174849 2606 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174852 2606 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174855 2606 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174858 2606 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174861 2606 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174864 2606 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174867 2606 flags.go:64] FLAG: --runonce="false" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174874 2606 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174877 2606 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174880 2606 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:08:16.178692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174883 2606 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174886 2606 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174889 2606 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174892 2606 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174895 2606 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174898 2606 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174902 2606 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174904 2606 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174908 2606 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 
15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174911 2606 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174914 2606 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174917 2606 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174923 2606 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174926 2606 flags.go:64] FLAG: --tls-cert-file="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174928 2606 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174933 2606 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174936 2606 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174939 2606 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174942 2606 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174945 2606 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174948 2606 flags.go:64] FLAG: --v="2" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174952 2606 flags.go:64] FLAG: --version="false" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174956 2606 flags.go:64] FLAG: --vmodule="" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174960 2606 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.174963 2606 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:08:16.179304 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175070 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175074 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175077 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175080 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175085 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175088 2606 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175091 2606 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175095 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175097 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175102 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175106 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175109 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175112 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175116 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175119 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175121 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175124 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175127 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175130 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:08:16.179936 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175132 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175135 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175138 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:08:16.180413 
ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175141 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175143 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175146 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175149 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175152 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175154 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175157 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175160 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175163 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175165 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175168 2606 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175170 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175173 2606 feature_gate.go:328] unrecognized feature gate: 
AzureMultiDisk Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175175 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175180 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175183 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175186 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:08:16.180413 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175188 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175191 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175193 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175196 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175199 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175201 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175204 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175206 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 
15:08:16.175209 2606 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175211 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175214 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175216 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175219 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175221 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175224 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175227 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175229 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175231 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175234 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175236 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:08:16.180930 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175239 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:08:16.181762 ip-10-0-141-188 
kubenswrapper[2606]: W0422 15:08:16.175241 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175244 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175246 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175249 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175251 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175254 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175256 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175260 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175265 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175269 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175272 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175274 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175277 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175282 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175284 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175287 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175289 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175292 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.181762 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175294 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175297 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175300 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175302 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175305 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175308 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175310 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.175313 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.182599 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.175878 2606 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:16.183084 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.183062 2606 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:08:16.183150 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.183086 2606 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183163 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183172 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183177 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183182 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183186 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183191 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183196 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.183198 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183200 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183205 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183209 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183213 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183218 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183222 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183226 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183230 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183234 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183238 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183243 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183247 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183252 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183256 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183260 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183264 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183269 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183273 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183277 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183282 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.183593 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183286 2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183291 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183295 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183299 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183304 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183309 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183316 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183323 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183328 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183333 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183337 2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183341 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183345 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183349 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183354 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183374 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183379 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183384 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183388 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183392 2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.184432 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183397 2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183402 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183406 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183410 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183414 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183418 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183422 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183426 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183431 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183435 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183440 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183444 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183448 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183452 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183457 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183461 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183465 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183476 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183482 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.184979 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183488 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183492 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183497 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183501 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183505 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183509 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183514 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183519 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183523 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183527 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183532 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183536 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183540 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183544 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183548 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183552 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183556 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183561 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183565 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.185504 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183569 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.183578 2606 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183740 2606 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183749 2606 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183754 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183758 2606 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183763 2606 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183768 2606 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183773 2606 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183777 2606 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183781 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183786 2606 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183791 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183796 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183800 2606 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:16.186212 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183805 2606 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183809 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183813 2606 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183817 2606 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183821 2606 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183826 2606 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183830 2606 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183835 2606 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183839 2606 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183843 2606 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183848 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183852 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183857 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183861 2606 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183865 2606 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183869 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183873 2606 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183877 2606 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183881 2606 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183885 2606 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:16.186655 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183890 2606 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183894 2606 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183898 2606 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183903 2606 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183907 2606 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183911 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183916 2606 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183920 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183924 2606 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183929 2606 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183934 2606 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183941 2606 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183947 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183952 2606 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183957 2606 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183961 2606 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183966 2606 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183971 2606 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183975 2606 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:16.187178 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183980 2606 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183986 2606 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183991 2606 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.183996 2606 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184000 2606 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184004 2606 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184008 2606 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184012 2606 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184016 2606 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184020 2606 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184025 2606 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184029 2606 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184034 2606 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184038 2606 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184042 2606 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184046 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184050 2606 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184055 2606 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184059 2606 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:16.187832 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184063 2606 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184067 2606 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184071 2606 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184076 2606 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184080 2606 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184084 2606 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184088 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184093 2606 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184097 2606 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184101 2606 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184106 2606 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184110 2606 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184114 2606 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184118 2606 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:16.184122 2606 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.184130 2606 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:08:16.188619 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.184879 2606 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:08:16.189282 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.189266 2606 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:08:16.190158 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.190144 2606 server.go:1019] "Starting client certificate rotation"
Apr 22 15:08:16.190258 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.190240 2606 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:08:16.190295 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.190280 2606 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:08:16.213401 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.213355 2606 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:08:16.217154 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.217123 2606 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:08:16.232333 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.232183 2606 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:08:16.237805 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.237789 2606 log.go:25] "Validated CRI v1 image API"
Apr 22 15:08:16.239109 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.239095 2606 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:08:16.241314 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.241293 2606 fs.go:135] Filesystem UUIDs: map[561ebb27-b95a-4e73-ba05-174fb39694bb:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9b043865-6b47-4ff1-991c-d31173ff438b:/dev/nvme0n1p4]
Apr 22 15:08:16.241394 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.241313 2606 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:08:16.246845 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.246817 2606 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:08:16.247062 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.246941 2606 manager.go:217] Machine: {Timestamp:2026-04-22 15:08:16.245207939 +0000 UTC m=+0.359156534 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099515 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec230002a782ec6920be39e9a4fbf02e SystemUUID:ec230002-a782-ec69-20be-39e9a4fbf02e BootID:52bf8712-65dd-4d9f-ad32-541f10fa7bb8 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5d:72:27:82:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5d:72:27:82:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:4a:3e:ad:8e:ec Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:08:16.247062 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247058 2606 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:08:16.247212 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247183 2606 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 15:08:16.247592 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247564 2606 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 15:08:16.247759 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247592 2606 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-188.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 15:08:16.247836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247773 2606 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 15:08:16.247836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247784 2606 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 15:08:16.247836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.247803 2606 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:08:16.248471 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.248450 2606 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:08:16.249854 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.249842 2606 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:16.249978 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.249968 2606 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 15:08:16.252489 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.252478 2606 kubelet.go:491] "Attempting to sync node with API server" Apr 22 15:08:16.252556 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.252496 2606 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 15:08:16.252556 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.252513 2606 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 15:08:16.252556 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.252528 2606 kubelet.go:397] "Adding apiserver pod source" Apr 22 15:08:16.252556 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.252541 2606 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 15:08:16.253638 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.253624 2606 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:08:16.253704 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.253651 2606 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:08:16.256776 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.256761 2606 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 15:08:16.258569 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.258553 2606 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:08:16.260275 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260261 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260282 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260291 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260300 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260309 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260317 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260326 2606 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260335 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260347 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:08:16.260378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260371 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:08:16.261055 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260395 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 15:08:16.261055 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.260409 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:08:16.261315 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.261303 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:08:16.261382 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.261320 2606 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:08:16.263909 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.263878 2606 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-188.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:08:16.263909 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.263891 2606 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:08:16.265169 ip-10-0-141-188 kubenswrapper[2606]: 
I0422 15:08:16.265154 2606 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:08:16.265246 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.265197 2606 server.go:1295] "Started kubelet" Apr 22 15:08:16.265307 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.265250 2606 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:08:16.265399 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.265336 2606 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:08:16.265448 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.265422 2606 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:08:16.266108 ip-10-0-141-188 systemd[1]: Started Kubernetes Kubelet. Apr 22 15:08:16.266983 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.266951 2606 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:08:16.268201 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.268184 2606 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:08:16.278068 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.278046 2606 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-188.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 15:08:16.278645 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.278626 2606 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:08:16.279067 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.278110 2606 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-188.ec2.internal.18a8b64d55c1e004 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-188.ec2.internal,UID:ip-10-0-141-188.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-188.ec2.internal,},FirstTimestamp:2026-04-22 15:08:16.265166852 +0000 UTC m=+0.379115450,LastTimestamp:2026-04-22 15:08:16.265166852 +0000 UTC m=+0.379115450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-188.ec2.internal,}" Apr 22 15:08:16.282326 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.282310 2606 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-59rbg" Apr 22 15:08:16.283052 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.283034 2606 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:08:16.283123 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.283040 2606 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:16.283783 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.283761 2606 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:08:16.283935 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.283921 2606 volume_manager.go:295] "The desired_state_of_world populator 
starts" Apr 22 15:08:16.284018 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.283995 2606 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:08:16.284075 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.283925 2606 factory.go:55] Registering systemd factory Apr 22 15:08:16.284123 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284100 2606 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:08:16.284169 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284124 2606 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:08:16.284169 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284133 2606 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:08:16.284312 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.284294 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.284409 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284334 2606 factory.go:153] Registering CRI-O factory Apr 22 15:08:16.284409 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284349 2606 factory.go:223] Registration of the crio container factory successfully Apr 22 15:08:16.284593 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284439 2606 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:08:16.284593 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284477 2606 factory.go:103] Registering Raw factory Apr 22 15:08:16.284593 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.284492 2606 manager.go:1196] Started watching for new ooms in manager Apr 22 15:08:16.285070 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.285043 2606 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"ip-10-0-141-188.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 15:08:16.285153 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.285088 2606 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 15:08:16.285674 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.285661 2606 manager.go:319] Starting recovery of all containers Apr 22 15:08:16.291018 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.290997 2606 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-59rbg" Apr 22 15:08:16.294531 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.294514 2606 manager.go:324] Recovery completed Apr 22 15:08:16.298981 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.298965 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.309560 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.309544 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.309646 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.309574 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.309646 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.309584 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:16.310182 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:08:16.310166 2606 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:08:16.310182 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.310179 2606 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:08:16.310292 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.310196 2606 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:16.313713 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.313701 2606 policy_none.go:49] "None policy: Start" Apr 22 15:08:16.313759 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.313717 2606 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:08:16.313759 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.313726 2606 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:08:16.349928 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.349910 2606 manager.go:341] "Starting Device Plugin manager" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.349944 2606 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.349955 2606 server.go:85] "Starting device plugin registration server" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.350197 2606 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.350208 2606 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.350293 2606 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.350351 2606 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" 
Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.350375 2606 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.351574 2606 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 15:08:16.366575 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.351608 2606 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.410320 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.410290 2606 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:08:16.411415 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.411398 2606 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 15:08:16.411415 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.411421 2606 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:08:16.411589 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.411471 2606 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 15:08:16.411589 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.411481 2606 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:08:16.411589 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.411519 2606 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:08:16.414825 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.414777 2606 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:16.450447 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.450425 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.451540 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.451522 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.451633 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.451555 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.451633 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.451565 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:16.451633 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.451590 2606 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.457814 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.457799 2606 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.457868 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.457822 2606 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-188.ec2.internal\": node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 
15:08:16.474836 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.474814 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.511662 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.511640 2606 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal"] Apr 22 15:08:16.511759 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.511713 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.512666 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.512647 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.512757 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.512679 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.512757 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.512694 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:16.515102 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515089 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.515276 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515263 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.515320 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515294 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.515823 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515808 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.515890 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515838 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.515890 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515850 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:16.515953 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515807 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.515953 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515917 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.515953 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.515929 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:16.518179 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.518162 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.518273 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.518186 2606 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:16.518833 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.518819 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:16.518907 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.518842 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:16.518907 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.518854 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:16.540907 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.540885 2606 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-188.ec2.internal\" not found" node="ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.545228 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.545212 2606 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-188.ec2.internal\" not found" node="ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.574886 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.574863 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.585208 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.585188 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/af2477b9db2ffbcd2bd186aed8c8adcf-config\") pod 
\"kube-apiserver-proxy-ip-10-0-141-188.ec2.internal\" (UID: \"af2477b9db2ffbcd2bd186aed8c8adcf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.585261 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.585214 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9d6856789bf25ffb38c205135a36931c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal\" (UID: \"9d6856789bf25ffb38c205135a36931c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.585261 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.585235 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d6856789bf25ffb38c205135a36931c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal\" (UID: \"9d6856789bf25ffb38c205135a36931c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.675890 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.675816 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.686168 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.686147 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9d6856789bf25ffb38c205135a36931c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal\" (UID: \"9d6856789bf25ffb38c205135a36931c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.686224 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.686173 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9d6856789bf25ffb38c205135a36931c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal\" (UID: \"9d6856789bf25ffb38c205135a36931c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.686224 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.686191 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/af2477b9db2ffbcd2bd186aed8c8adcf-config\") pod \"kube-apiserver-proxy-ip-10-0-141-188.ec2.internal\" (UID: \"af2477b9db2ffbcd2bd186aed8c8adcf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.686288 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.686241 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/af2477b9db2ffbcd2bd186aed8c8adcf-config\") pod \"kube-apiserver-proxy-ip-10-0-141-188.ec2.internal\" (UID: \"af2477b9db2ffbcd2bd186aed8c8adcf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.686288 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.686243 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9d6856789bf25ffb38c205135a36931c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal\" (UID: \"9d6856789bf25ffb38c205135a36931c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.686288 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.686262 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d6856789bf25ffb38c205135a36931c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal\" (UID: \"9d6856789bf25ffb38c205135a36931c\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.776614 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.776568 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.843134 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.843102 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.847812 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:16.847790 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" Apr 22 15:08:16.877086 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.877055 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:16.977668 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:16.977572 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:17.078162 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:17.078133 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:17.166228 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.166192 2606 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:17.178338 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:17.178318 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:17.189841 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.189806 2606 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 22 15:08:17.189965 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.189946 2606 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:08:17.190011 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.189972 2606 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 15:08:17.278441 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:17.278411 2606 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-188.ec2.internal\" not found" Apr 22 15:08:17.283810 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.283789 2606 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:17.293808 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.293777 2606 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:03:16 +0000 UTC" deadline="2027-10-03 00:01:57.315029364 +0000 UTC" Apr 22 15:08:17.293897 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.293812 2606 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12680h53m40.021228272s" Apr 22 15:08:17.306326 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.306302 2606 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:08:17.354118 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.354092 2606 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:17.383525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.383496 2606 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" Apr 22 15:08:17.383673 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.383582 2606 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hmsk6" Apr 22 15:08:17.392557 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.392537 2606 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hmsk6" Apr 22 15:08:17.396352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.396338 2606 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:08:17.397816 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.397798 2606 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" Apr 22 15:08:17.414544 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.414522 2606 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:08:17.441310 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:17.441134 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d6856789bf25ffb38c205135a36931c.slice/crio-6ddef11987b050a45db38c481c7729fd12cd38d5a332e20f834d9cfb489d786b WatchSource:0}: Error finding container 6ddef11987b050a45db38c481c7729fd12cd38d5a332e20f834d9cfb489d786b: Status 404 returned 
error can't find the container with id 6ddef11987b050a45db38c481c7729fd12cd38d5a332e20f834d9cfb489d786b Apr 22 15:08:17.441660 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:17.441639 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2477b9db2ffbcd2bd186aed8c8adcf.slice/crio-ce755051b4939cf8ef00fccb7dc5ad6aa5aaad6fad841a5fba028d1cf05c5582 WatchSource:0}: Error finding container ce755051b4939cf8ef00fccb7dc5ad6aa5aaad6fad841a5fba028d1cf05c5582: Status 404 returned error can't find the container with id ce755051b4939cf8ef00fccb7dc5ad6aa5aaad6fad841a5fba028d1cf05c5582 Apr 22 15:08:17.446097 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.446079 2606 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:08:17.454597 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:17.454577 2606 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:18.253473 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.253435 2606 apiserver.go:52] "Watching apiserver" Apr 22 15:08:18.261394 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.261349 2606 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 15:08:18.263227 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.263197 2606 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/node-resolver-9wsxv","openshift-image-registry/node-ca-2hlwz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal","openshift-multus/multus-additional-cni-plugins-zcx5t","openshift-cluster-node-tuning-operator/tuned-w9rvd","openshift-multus/multus-zlnwc","openshift-multus/network-metrics-daemon-75v74","openshift-network-diagnostics/network-check-target-w6z28","openshift-network-operator/iptables-alerter-f47hr","openshift-ovn-kubernetes/ovnkube-node-ck8h2","kube-system/global-pull-secret-syncer-qrz9r","kube-system/konnectivity-agent-s2b7k","kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"] Apr 22 15:08:18.266379 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.266344 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:18.266496 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.266450 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:18.268746 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.268594 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s2b7k" Apr 22 15:08:18.271099 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.270899 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.271880 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.271668 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7qqm6\"" Apr 22 15:08:18.271880 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.271698 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 15:08:18.272326 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.272295 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 15:08:18.273417 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.273341 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 15:08:18.273542 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.273504 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.273911 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.273787 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.273911 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.273813 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-6mf7c\"" Apr 22 15:08:18.275648 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.275628 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.276102 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.276023 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.278044 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.278024 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.278441 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.278422 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.278524 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.278465 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.278524 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.278424 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 15:08:18.279497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.279057 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 15:08:18.279497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.279075 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 15:08:18.279497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.279093 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zdh5s\"" Apr 22 15:08:18.279497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.279158 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qjp2l\"" Apr 22 15:08:18.279497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.279191 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.279497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.279490 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.281154 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.281135 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 15:08:18.281847 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.281815 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.282199 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.282163 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-k5vfw\"" Apr 22 15:08:18.284037 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.284016 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 15:08:18.284135 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.284061 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.285916 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.284848 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.285916 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.284848 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kzl22\"" Apr 22 15:08:18.290702 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.290676 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.293881 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.293580 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 15:08:18.293881 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.293732 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rrf8w\"" Apr 22 15:08:18.293881 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.293787 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 15:08:18.294097 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.293920 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294319 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-k8s-cni-cncf-io\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294397 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294425 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mqm6n\" (UniqueName: \"kubernetes.io/projected/002b7b51-3bad-42f0-b2ba-e4180da5923c-kube-api-access-mqm6n\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294449 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6e2a4fe1-ffb7-4e8e-8401-5dae97434c83-agent-certs\") pod \"konnectivity-agent-s2b7k\" (UID: \"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83\") " pod="kube-system/konnectivity-agent-s2b7k" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294475 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-cnibin\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294493 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-socket-dir-parent\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294513 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-netns\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294541 2606 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294607 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-modprobe-d\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294637 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-host\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294667 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4cf417d6-b90e-4570-8262-b67044850c51-multus-daemon-config\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294692 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-etc-kubernetes\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:08:18.294735 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294717 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/4cf417d6-b90e-4570-8262-b67044850c51-kube-api-access-96k2g\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294770 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-registration-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294798 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.294956 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294837 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/002b7b51-3bad-42f0-b2ba-e4180da5923c-hosts-file\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294903 2606 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-cnibin\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294946 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.294978 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-conf-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295004 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-multus-certs\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295030 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-cni-multus\") pod \"multus-zlnwc\" (UID: 
\"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295056 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysctl-conf\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295080 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-lib-modules\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295102 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4158dca8-c8a1-478c-92e2-6eff8a81fd54-iptables-alerter-script\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295125 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-system-cni-dir\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295164 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295237 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/002b7b51-3bad-42f0-b2ba-e4180da5923c-tmp-dir\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295307 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-kubernetes\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295339 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-system-cni-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295378 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-hostroot\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295403 2606 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-os-release\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.296033 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295427 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-run\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295450 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-sys\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295471 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/723a6368-c49b-450c-9575-bc16b7c8b86f-tmp\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295494 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psvcp\" (UniqueName: \"kubernetes.io/projected/723a6368-c49b-450c-9575-bc16b7c8b86f-kube-api-access-psvcp\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 
15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295518 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4158dca8-c8a1-478c-92e2-6eff8a81fd54-host-slash\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295541 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-socket-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295566 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8lvt\" (UniqueName: \"kubernetes.io/projected/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-kube-api-access-k8lvt\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295588 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-var-lib-kubelet\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295612 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-tuned\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295634 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4cf417d6-b90e-4570-8262-b67044850c51-cni-binary-copy\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295674 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-device-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295718 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-systemd\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295741 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-os-release\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295773 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48gxm\" (UniqueName: \"kubernetes.io/projected/807f61ad-b285-4b1d-b001-650d8ea8f622-kube-api-access-48gxm\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295802 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysconfig\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295851 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-cni-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.296849 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295877 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-kubelet\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295903 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nqrk\" (UniqueName: \"kubernetes.io/projected/7069128e-a7fb-43e9-a858-e8e3250b2ac0-kube-api-access-2nqrk\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295927 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysctl-d\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295949 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-cni-bin\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295975 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6e2a4fe1-ffb7-4e8e-8401-5dae97434c83-konnectivity-ca\") pod \"konnectivity-agent-s2b7k\" (UID: \"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83\") " pod="kube-system/konnectivity-agent-s2b7k"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.295998 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5t9\" (UniqueName: \"kubernetes.io/projected/4158dca8-c8a1-478c-92e2-6eff8a81fd54-kube-api-access-6x5t9\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.296077 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.296098 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-etc-selinux\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.296123 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-sys-fs\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.296149 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.296779 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.296840 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.296921 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.297258 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 15:08:18.297708 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.297585 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 15:08:18.298331 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.298022 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 15:08:18.298828 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.298757 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4dwgj\""
Apr 22 15:08:18.298922 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.298838 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 15:08:18.299315 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.299291 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:18.299315 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.299306 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9wsxv"
Apr 22 15:08:18.299518 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.299355 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:18.301618 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.301600 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vrq7v\""
Apr 22 15:08:18.302250 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.302046 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 15:08:18.302581 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.302567 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 15:08:18.386260 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.385754 2606 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 15:08:18.393616 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.393579 2606 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:17 +0000 UTC" deadline="2028-01-24 02:49:53.522578198 +0000 UTC"
Apr 22 15:08:18.393616 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.393604 2606 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15395h41m35.128976088s"
Apr 22 15:08:18.396320 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396289 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nqrk\" (UniqueName: \"kubernetes.io/projected/7069128e-a7fb-43e9-a858-e8e3250b2ac0-kube-api-access-2nqrk\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:18.396461 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396330 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysctl-d\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.396461 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396354 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-cni-bin\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.396461 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396398 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-ovn\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.396461 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396423 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.396461 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396449 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-cni-bin\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.396727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396450 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-env-overrides\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.396727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396535 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6e2a4fe1-ffb7-4e8e-8401-5dae97434c83-konnectivity-ca\") pod \"konnectivity-agent-s2b7k\" (UID: \"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83\") " pod="kube-system/konnectivity-agent-s2b7k"
Apr 22 15:08:18.396727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396600 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysctl-d\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.396727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396599 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5t9\" (UniqueName: \"kubernetes.io/projected/4158dca8-c8a1-478c-92e2-6eff8a81fd54-kube-api-access-6x5t9\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr"
Apr 22 15:08:18.396727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396675 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovn-node-metrics-cert\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.396727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396723 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-etc-selinux\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396753 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-sys-fs\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396779 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396805 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-k8s-cni-cncf-io\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396830 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-etc-selinux\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396833 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-kubelet\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396855 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-sys-fs\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396879 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396889 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-k8s-cni-cncf-io\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396899 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396926 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqm6n\" (UniqueName: \"kubernetes.io/projected/002b7b51-3bad-42f0-b2ba-e4180da5923c-kube-api-access-mqm6n\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.396949 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6e2a4fe1-ffb7-4e8e-8401-5dae97434c83-agent-certs\") pod \"konnectivity-agent-s2b7k\" (UID: \"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83\") " pod="kube-system/konnectivity-agent-s2b7k"
Apr 22 15:08:18.397006 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397000 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397069 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-cnibin\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397100 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-socket-dir-parent\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397126 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-netns\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397152 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3bf255c-11af-482e-b25f-be50be214aed-serviceca\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397168 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6e2a4fe1-ffb7-4e8e-8401-5dae97434c83-konnectivity-ca\") pod \"konnectivity-agent-s2b7k\" (UID: \"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83\") " pod="kube-system/konnectivity-agent-s2b7k"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397177 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397202 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-modprobe-d\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397205 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-netns\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397176 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-cnibin\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397227 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxkfv\" (UniqueName: \"kubernetes.io/projected/d3bf255c-11af-482e-b25f-be50be214aed-kube-api-access-sxkfv\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397243 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-socket-dir-parent\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397255 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-systemd\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397278 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-cni-netd\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.397284 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397294 2606 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397329 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-modprobe-d\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397302 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-host\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.397685 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397348 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-host\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.397377 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:18.897329097 +0000 UTC m=+3.011277690 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397407 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4cf417d6-b90e-4570-8262-b67044850c51-multus-daemon-config\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397432 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-etc-kubernetes\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397454 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/4cf417d6-b90e-4570-8262-b67044850c51-kube-api-access-96k2g\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397478 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3bf255c-11af-482e-b25f-be50be214aed-host\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397404 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397501 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-slash\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397530 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-registration-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397532 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-etc-kubernetes\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397556 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397693 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-registration-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397723 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/002b7b51-3bad-42f0-b2ba-e4180da5923c-hosts-file\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397746 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397749 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-systemd-units\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397795 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/002b7b51-3bad-42f0-b2ba-e4180da5923c-hosts-file\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv"
Apr 22 15:08:18.398541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397797 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-etc-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397828 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-cnibin\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397853 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397884 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-conf-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397882 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-cnibin\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397944 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-conf-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397951 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4cf417d6-b90e-4570-8262-b67044850c51-multus-daemon-config\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397944 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-multus-certs\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.397982 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-run-multus-certs\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc"
Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398011 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-cni-bin\") pod \"ovnkube-node-ck8h2\" (UID: 
\"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398042 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-cni-multus\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398073 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-log-socket\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398099 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398129 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysctl-conf\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398128 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-cni-multus\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398169 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-lib-modules\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398242 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovnkube-config\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.399377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398273 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4158dca8-c8a1-478c-92e2-6eff8a81fd54-iptables-alerter-script\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398309 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398275 2606 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysctl-conf\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398312 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-dbus\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398281 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-lib-modules\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398356 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-system-cni-dir\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398396 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400116 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398411 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-system-cni-dir\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398426 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/002b7b51-3bad-42f0-b2ba-e4180da5923c-tmp-dir\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398451 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-kubernetes\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398476 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-system-cni-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398498 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-hostroot\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: 
I0422 15:08:18.398522 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-os-release\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398543 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-run\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398567 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-sys\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398575 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-hostroot\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398590 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/723a6368-c49b-450c-9575-bc16b7c8b86f-tmp\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400116 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398617 2606 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-psvcp\" (UniqueName: \"kubernetes.io/projected/723a6368-c49b-450c-9575-bc16b7c8b86f-kube-api-access-psvcp\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398626 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-system-cni-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398542 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-kubernetes\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398641 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4158dca8-c8a1-478c-92e2-6eff8a81fd54-host-slash\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398674 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398697 2606 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807f61ad-b285-4b1d-b001-650d8ea8f622-os-release\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398706 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-run-ovn-kubernetes\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398712 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/002b7b51-3bad-42f0-b2ba-e4180da5923c-tmp-dir\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398735 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-socket-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398752 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-run\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:08:18.398766 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8lvt\" (UniqueName: \"kubernetes.io/projected/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-kube-api-access-k8lvt\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398794 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-var-lib-kubelet\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398810 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-sys\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398818 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-tuned\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398843 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4cf417d6-b90e-4570-8262-b67044850c51-cni-binary-copy\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:08:18.398845 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/807f61ad-b285-4b1d-b001-650d8ea8f622-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398870 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4158dca8-c8a1-478c-92e2-6eff8a81fd54-host-slash\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.400793 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.398873 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-run-netns\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399071 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-var-lib-kubelet\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399162 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-node-log\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399219 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovnkube-script-lib\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399273 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-device-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399327 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-device-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399350 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-systemd\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399393 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4158dca8-c8a1-478c-92e2-6eff8a81fd54-iptables-alerter-script\") 
pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399414 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-os-release\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399419 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-systemd\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399453 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c65j\" (UniqueName: \"kubernetes.io/projected/d9e225c1-9713-4720-8357-aaf7078a9c2d-kube-api-access-9c65j\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399482 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-kubelet-config\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399486 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-os-release\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399509 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48gxm\" (UniqueName: \"kubernetes.io/projected/807f61ad-b285-4b1d-b001-650d8ea8f622-kube-api-access-48gxm\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399533 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysconfig\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399579 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-cni-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399604 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-kubelet\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.401460 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399638 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-var-lib-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399641 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-sysconfig\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399659 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-multus-cni-dir\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399720 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf417d6-b90e-4570-8262-b67044850c51-host-var-lib-kubelet\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399797 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-socket-dir\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.399879 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/4cf417d6-b90e-4570-8262-b67044850c51-cni-binary-copy\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.401111 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6e2a4fe1-ffb7-4e8e-8401-5dae97434c83-agent-certs\") pod \"konnectivity-agent-s2b7k\" (UID: \"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83\") " pod="kube-system/konnectivity-agent-s2b7k" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.401273 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/723a6368-c49b-450c-9575-bc16b7c8b86f-tmp\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.402048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.401385 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/723a6368-c49b-450c-9575-bc16b7c8b86f-etc-tuned\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.411751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.411557 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8lvt\" (UniqueName: \"kubernetes.io/projected/cdcf5e3d-d86e-4839-ab5e-b1b382d5c832-kube-api-access-k8lvt\") pod \"aws-ebs-csi-driver-node-6zbs6\" (UID: \"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.411751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.411663 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqm6n\" (UniqueName: 
\"kubernetes.io/projected/002b7b51-3bad-42f0-b2ba-e4180da5923c-kube-api-access-mqm6n\") pod \"node-resolver-9wsxv\" (UID: \"002b7b51-3bad-42f0-b2ba-e4180da5923c\") " pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.412294 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.411754 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nqrk\" (UniqueName: \"kubernetes.io/projected/7069128e-a7fb-43e9-a858-e8e3250b2ac0-kube-api-access-2nqrk\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:18.412294 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.412026 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5t9\" (UniqueName: \"kubernetes.io/projected/4158dca8-c8a1-478c-92e2-6eff8a81fd54-kube-api-access-6x5t9\") pod \"iptables-alerter-f47hr\" (UID: \"4158dca8-c8a1-478c-92e2-6eff8a81fd54\") " pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.412513 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.412308 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/4cf417d6-b90e-4570-8262-b67044850c51-kube-api-access-96k2g\") pod \"multus-zlnwc\" (UID: \"4cf417d6-b90e-4570-8262-b67044850c51\") " pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.412726 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.412698 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psvcp\" (UniqueName: \"kubernetes.io/projected/723a6368-c49b-450c-9575-bc16b7c8b86f-kube-api-access-psvcp\") pod \"tuned-w9rvd\" (UID: \"723a6368-c49b-450c-9575-bc16b7c8b86f\") " pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.412991 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.412974 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-48gxm\" (UniqueName: \"kubernetes.io/projected/807f61ad-b285-4b1d-b001-650d8ea8f622-kube-api-access-48gxm\") pod \"multus-additional-cni-plugins-zcx5t\" (UID: \"807f61ad-b285-4b1d-b001-650d8ea8f622\") " pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.415713 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.415664 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" event={"ID":"af2477b9db2ffbcd2bd186aed8c8adcf","Type":"ContainerStarted","Data":"ce755051b4939cf8ef00fccb7dc5ad6aa5aaad6fad841a5fba028d1cf05c5582"} Apr 22 15:08:18.416754 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.416728 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" event={"ID":"9d6856789bf25ffb38c205135a36931c","Type":"ContainerStarted","Data":"6ddef11987b050a45db38c481c7729fd12cd38d5a332e20f834d9cfb489d786b"} Apr 22 15:08:18.500012 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.499979 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3bf255c-11af-482e-b25f-be50be214aed-host\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.500012 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500015 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-slash\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500046 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-systemd-units\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500063 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-etc-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500087 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-cni-bin\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500105 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-log-socket\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500105 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d3bf255c-11af-482e-b25f-be50be214aed-host\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500125 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: 
\"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500158 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-systemd-units\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500169 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-slash\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500180 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-etc-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500175 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-cni-bin\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500177 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-log-socket\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500167 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovnkube-config\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500305 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-dbus\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500341 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500381 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-run-ovn-kubernetes\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500412 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-run-netns\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500422 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500435 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-node-log\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500474 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-run-ovn-kubernetes\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500479 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-run-netns\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500484 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovnkube-script-lib\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500514 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-node-log\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500525 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-dbus\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500533 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c65j\" (UniqueName: \"kubernetes.io/projected/d9e225c1-9713-4720-8357-aaf7078a9c2d-kube-api-access-9c65j\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500557 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-kubelet-config\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500583 2606 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-var-lib-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500611 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-ovn\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500635 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.500751 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500640 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-var-lib-openvswitch\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500649 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-kubelet-config\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:08:18.500668 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-env-overrides\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500681 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-ovn\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500698 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovn-node-metrics-cert\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500702 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500727 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-kubelet\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500761 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500824 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3bf255c-11af-482e-b25f-be50be214aed-serviceca\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500833 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovnkube-config\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500846 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-kubelet\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500864 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxkfv\" (UniqueName: \"kubernetes.io/projected/d3bf255c-11af-482e-b25f-be50be214aed-kube-api-access-sxkfv\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz" Apr 22 
15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.500904 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-systemd\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.500930 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.501000 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:19.000977657 +0000 UTC m=+3.114926244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.501018 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-run-systemd\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.501021 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-cni-netd\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.501493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.501061 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovnkube-script-lib\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.502069 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.501092 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9e225c1-9713-4720-8357-aaf7078a9c2d-env-overrides\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.502069 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.501112 2606 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9e225c1-9713-4720-8357-aaf7078a9c2d-host-cni-netd\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.502069 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.501263 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d3bf255c-11af-482e-b25f-be50be214aed-serviceca\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.503261 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.503240 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9e225c1-9713-4720-8357-aaf7078a9c2d-ovn-node-metrics-cert\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.517918 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.517851 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:18.517918 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.517878 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:18.517918 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.517892 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 
15:08:18.518130 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.517948 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:19.017932608 +0000 UTC m=+3.131881203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:18.520397 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.520353 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxkfv\" (UniqueName: \"kubernetes.io/projected/d3bf255c-11af-482e-b25f-be50be214aed-kube-api-access-sxkfv\") pod \"node-ca-2hlwz\" (UID: \"d3bf255c-11af-482e-b25f-be50be214aed\") " pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.521660 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.521632 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c65j\" (UniqueName: \"kubernetes.io/projected/d9e225c1-9713-4720-8357-aaf7078a9c2d-kube-api-access-9c65j\") pod \"ovnkube-node-ck8h2\" (UID: \"d9e225c1-9713-4720-8357-aaf7078a9c2d\") " pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.587779 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.587748 2606 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:18.589773 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.589751 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-s2b7k" Apr 22 15:08:18.600233 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.600208 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" Apr 22 15:08:18.611965 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.611929 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" Apr 22 15:08:18.616629 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.616611 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" Apr 22 15:08:18.623210 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.623191 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zlnwc" Apr 22 15:08:18.631814 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.631793 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f47hr" Apr 22 15:08:18.638325 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.638307 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2hlwz" Apr 22 15:08:18.645076 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.645056 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:08:18.650671 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.650652 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9wsxv" Apr 22 15:08:18.904707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:18.904618 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:18.904858 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.904740 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:18.904858 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:18.904812 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:19.904793891 +0000 UTC m=+4.018742487 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:19.005482 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.005439 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:19.005662 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.005602 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:19.005709 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.005676 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:20.005658455 +0000 UTC m=+4.119607047 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:19.106179 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.106146 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:19.106324 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.106274 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:19.106324 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.106288 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:19.106324 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.106298 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:19.106465 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.106346 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:20.106334212 +0000 UTC m=+4.220282794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:19.134299 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.134273 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod002b7b51_3bad_42f0_b2ba_e4180da5923c.slice/crio-1b536e1a84c71325ad21caa5a9374912b990fcb509f5f139bf9ed6a5dbef34b6 WatchSource:0}: Error finding container 1b536e1a84c71325ad21caa5a9374912b990fcb509f5f139bf9ed6a5dbef34b6: Status 404 returned error can't find the container with id 1b536e1a84c71325ad21caa5a9374912b990fcb509f5f139bf9ed6a5dbef34b6 Apr 22 15:08:19.136417 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.136356 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807f61ad_b285_4b1d_b001_650d8ea8f622.slice/crio-4a820e1b1d3e1f82eee8fefe86399925d43c64f3b75b374bbcc236057101bb74 WatchSource:0}: Error finding container 4a820e1b1d3e1f82eee8fefe86399925d43c64f3b75b374bbcc236057101bb74: Status 404 returned error can't find the container with id 4a820e1b1d3e1f82eee8fefe86399925d43c64f3b75b374bbcc236057101bb74 Apr 22 15:08:19.136787 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.136762 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2a4fe1_ffb7_4e8e_8401_5dae97434c83.slice/crio-bfe36ee230bcec0fc887c42ec376ff8cb463653307968425ebed459133b22fd0 WatchSource:0}: Error finding container 
bfe36ee230bcec0fc887c42ec376ff8cb463653307968425ebed459133b22fd0: Status 404 returned error can't find the container with id bfe36ee230bcec0fc887c42ec376ff8cb463653307968425ebed459133b22fd0 Apr 22 15:08:19.143262 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.143214 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcf5e3d_d86e_4839_ab5e_b1b382d5c832.slice/crio-e72f6231ed4bceca7d8e595186beb8fbe2affcc366971ce326b76db0944cb605 WatchSource:0}: Error finding container e72f6231ed4bceca7d8e595186beb8fbe2affcc366971ce326b76db0944cb605: Status 404 returned error can't find the container with id e72f6231ed4bceca7d8e595186beb8fbe2affcc366971ce326b76db0944cb605 Apr 22 15:08:19.144220 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.144193 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf417d6_b90e_4570_8262_b67044850c51.slice/crio-729a2d93250bc98e577ad3350edef3c3b19e6dbf692741ff5d01b235e7f92003 WatchSource:0}: Error finding container 729a2d93250bc98e577ad3350edef3c3b19e6dbf692741ff5d01b235e7f92003: Status 404 returned error can't find the container with id 729a2d93250bc98e577ad3350edef3c3b19e6dbf692741ff5d01b235e7f92003 Apr 22 15:08:19.145235 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.145136 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bf255c_11af_482e_b25f_be50be214aed.slice/crio-0254c2ba78e5b9a5a6465985e593f347b0e5cdb70873642c05a90f44ade3a976 WatchSource:0}: Error finding container 0254c2ba78e5b9a5a6465985e593f347b0e5cdb70873642c05a90f44ade3a976: Status 404 returned error can't find the container with id 0254c2ba78e5b9a5a6465985e593f347b0e5cdb70873642c05a90f44ade3a976 Apr 22 15:08:19.146055 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.145958 2606 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9e225c1_9713_4720_8357_aaf7078a9c2d.slice/crio-b7f9d33bf6097d0a3c53ad7380d6790f3c915987fec3b074b497da5999bf43f0 WatchSource:0}: Error finding container b7f9d33bf6097d0a3c53ad7380d6790f3c915987fec3b074b497da5999bf43f0: Status 404 returned error can't find the container with id b7f9d33bf6097d0a3c53ad7380d6790f3c915987fec3b074b497da5999bf43f0 Apr 22 15:08:19.146866 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:19.146770 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723a6368_c49b_450c_9575_bc16b7c8b86f.slice/crio-bcf7a180d6268e1362d141ce376cbde78b406189188ae95f7986010f454193eb WatchSource:0}: Error finding container bcf7a180d6268e1362d141ce376cbde78b406189188ae95f7986010f454193eb: Status 404 returned error can't find the container with id bcf7a180d6268e1362d141ce376cbde78b406189188ae95f7986010f454193eb Apr 22 15:08:19.394750 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.394713 2606 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:17 +0000 UTC" deadline="2028-01-05 23:41:19.762617836 +0000 UTC" Apr 22 15:08:19.394750 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.394746 2606 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14960h33m0.367875982s" Apr 22 15:08:19.421179 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.421092 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" event={"ID":"af2477b9db2ffbcd2bd186aed8c8adcf","Type":"ContainerStarted","Data":"a32f797cd4fed384b7096e4306de7e0547c563a92b861e639d5c39970316e713"} Apr 22 15:08:19.422202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.422172 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" event={"ID":"723a6368-c49b-450c-9575-bc16b7c8b86f","Type":"ContainerStarted","Data":"bcf7a180d6268e1362d141ce376cbde78b406189188ae95f7986010f454193eb"} Apr 22 15:08:19.422972 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.422950 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"b7f9d33bf6097d0a3c53ad7380d6790f3c915987fec3b074b497da5999bf43f0"} Apr 22 15:08:19.423811 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.423790 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" event={"ID":"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832","Type":"ContainerStarted","Data":"e72f6231ed4bceca7d8e595186beb8fbe2affcc366971ce326b76db0944cb605"} Apr 22 15:08:19.424651 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.424628 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2b7k" event={"ID":"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83","Type":"ContainerStarted","Data":"bfe36ee230bcec0fc887c42ec376ff8cb463653307968425ebed459133b22fd0"} Apr 22 15:08:19.425465 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.425445 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerStarted","Data":"4a820e1b1d3e1f82eee8fefe86399925d43c64f3b75b374bbcc236057101bb74"} Apr 22 15:08:19.426262 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.426240 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2hlwz" event={"ID":"d3bf255c-11af-482e-b25f-be50be214aed","Type":"ContainerStarted","Data":"0254c2ba78e5b9a5a6465985e593f347b0e5cdb70873642c05a90f44ade3a976"} Apr 22 15:08:19.427162 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.427144 
2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zlnwc" event={"ID":"4cf417d6-b90e-4570-8262-b67044850c51","Type":"ContainerStarted","Data":"729a2d93250bc98e577ad3350edef3c3b19e6dbf692741ff5d01b235e7f92003"} Apr 22 15:08:19.427994 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.427975 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f47hr" event={"ID":"4158dca8-c8a1-478c-92e2-6eff8a81fd54","Type":"ContainerStarted","Data":"303d83ac20275e9476e4a6925bbb0577de5509badff49710c773901361d580a2"} Apr 22 15:08:19.429115 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.429093 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9wsxv" event={"ID":"002b7b51-3bad-42f0-b2ba-e4180da5923c","Type":"ContainerStarted","Data":"1b536e1a84c71325ad21caa5a9374912b990fcb509f5f139bf9ed6a5dbef34b6"} Apr 22 15:08:19.445298 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.445246 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-188.ec2.internal" podStartSLOduration=2.445230203 podStartE2EDuration="2.445230203s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:08:19.44496362 +0000 UTC m=+3.558912224" watchObservedRunningTime="2026-04-22 15:08:19.445230203 +0000 UTC m=+3.559178807" Apr 22 15:08:19.522526 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.522301 2606 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:19.912562 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:19.912521 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod 
\"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:19.912748 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.912723 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:19.912816 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:19.912794 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:21.91277175 +0000 UTC m=+6.026720355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:20.013473 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.013441 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:20.013640 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.013623 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:20.013701 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.013688 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:22.013669732 +0000 UTC m=+6.127618332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:20.114692 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.114026 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:20.114692 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.114237 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:20.114692 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.114255 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:20.114692 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.114269 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:20.114692 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.114330 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r 
podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:22.114311257 +0000 UTC m=+6.228259842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:20.412516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.412014 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:20.412516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.412015 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:20.412516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.412131 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:20.412516 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.412151 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:20.412516 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.412250 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:20.412516 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:20.412325 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:20.438600 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.438566 2606 generic.go:358] "Generic (PLEG): container finished" podID="9d6856789bf25ffb38c205135a36931c" containerID="a9d0b50d339c2ffff12f7711bf392345a7fb98ae65bdff5900de197715f17796" exitCode=0 Apr 22 15:08:20.439554 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:20.439528 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" event={"ID":"9d6856789bf25ffb38c205135a36931c","Type":"ContainerDied","Data":"a9d0b50d339c2ffff12f7711bf392345a7fb98ae65bdff5900de197715f17796"} Apr 22 15:08:21.449045 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:21.448444 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" 
event={"ID":"9d6856789bf25ffb38c205135a36931c","Type":"ContainerStarted","Data":"5547489fb25b3d624c1c6449e2093e771c0afefe767d4e14154f34c41911d6fb"} Apr 22 15:08:21.931846 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:21.931719 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:21.932036 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:21.931861 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:21.932036 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:21.931938 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:25.931916542 +0000 UTC m=+10.045865141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:22.032766 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:22.032659 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:22.032964 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.032815 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:22.032964 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.032885 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:26.032866755 +0000 UTC m=+10.146815344 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:22.133419 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:22.133381 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:22.133595 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.133563 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:22.133595 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.133584 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:22.133595 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.133596 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:22.133773 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.133667 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:26.133648271 +0000 UTC m=+10.247596877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:22.413215 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:22.412478 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:22.413215 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:22.412514 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:22.413215 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.412613 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:22.413215 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.413045 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:22.413215 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:22.413097 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:22.413215 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:22.413181 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:24.413315 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:24.412608 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:24.413315 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:24.412744 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:24.413315 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:24.413095 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:24.413315 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:24.413155 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:24.413315 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:24.413258 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:24.413978 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:24.413343 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:25.966797 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:25.966138 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:25.966797 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:25.966318 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:25.966797 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:25.966400 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:33.966380319 +0000 UTC m=+18.080328901 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:26.066901 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:26.066867 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:26.067080 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.067028 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:26.067139 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.067102 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:34.06708367 +0000 UTC m=+18.181032273 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:26.168597 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:26.167959 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:26.168597 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.168122 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:26.168597 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.168140 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:26.168597 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.168152 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:26.168597 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.168213 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:34.168194344 +0000 UTC m=+18.282142941 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:26.412572 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:26.412534 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:26.412758 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.412670 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:26.418378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:26.418329 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:26.418515 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.418455 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:26.418515 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:26.418472 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:26.418631 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:26.418607 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:28.412020 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:28.411942 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:28.412494 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:28.411949 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:28.412494 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:28.412063 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:28.412494 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:28.412149 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:28.412494 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:28.411949 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:28.412494 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:28.412256 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:30.412143 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:30.412113 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:30.412609 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:30.412216 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:30.412609 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:30.412230 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:30.412609 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:30.412316 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:30.412609 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:30.412378 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:30.412609 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:30.412471 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:32.412142 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:32.412058 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:32.412142 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:32.412099 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:32.412651 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:32.412067 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:32.412651 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:32.412201 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:32.412651 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:32.412299 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:32.412651 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:32.412410 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:34.021477 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:34.021237 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:34.021971 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.021422 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:34.021971 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.021573 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:50.02155138 +0000 UTC m=+34.135499966 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:34.121941 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:34.121907 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:34.122121 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.122073 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:34.122176 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.122166 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. No retries permitted until 2026-04-22 15:08:50.122145722 +0000 UTC m=+34.236094305 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:34.222947 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:34.222909 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:34.223136 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.223047 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:34.223136 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.223072 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:34.223136 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.223085 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:34.223284 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.223154 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:50.223135302 +0000 UTC m=+34.337083891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:34.411933 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:34.411854 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:34.412081 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:34.411854 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:34.412081 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.411975 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:34.412081 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.412019 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:34.412081 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:34.411854 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:34.412274 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:34.412101 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:36.415867 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:36.414994 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:36.415867 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:36.415127 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:36.415867 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:36.415580 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:36.415867 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:36.415678 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:36.415867 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:36.415749 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:36.415867 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:36.415817 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:37.479353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.479119 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" event={"ID":"723a6368-c49b-450c-9575-bc16b7c8b86f","Type":"ContainerStarted","Data":"4b953c2e4239aba2aac20641eabcc1d4b0fa16c1d494d496224da5622c9db90a"} Apr 22 15:08:37.482522 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482500 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:08:37.482868 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482838 2606 generic.go:358] "Generic (PLEG): container finished" podID="d9e225c1-9713-4720-8357-aaf7078a9c2d" containerID="41b19a0488df99f49d375a4fd32b9f67f2b9e9d0adf650d725b9cbdbe0ba810e" exitCode=1 Apr 22 15:08:37.482976 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482905 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"58b1b417d20a843be76dafb5f61a28fa321513c9bd16b71e2901182857425d92"} Apr 22 15:08:37.482976 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482945 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"66563233c697dd39140d578334bbb2d88c2ea0d2881825ac4f23433c1562933b"} Apr 22 15:08:37.482976 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482958 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"4fb1c4dfa9bafdf59ff4ade1534af4b9073f1cff2e03031461df16e67d9ecaac"} Apr 22 
15:08:37.482976 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482970 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"a4327743ff809cb908d05d13b2ac15e455d752b17cfbca5e51aa3fb0a034ed07"} Apr 22 15:08:37.483164 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.482981 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerDied","Data":"41b19a0488df99f49d375a4fd32b9f67f2b9e9d0adf650d725b9cbdbe0ba810e"} Apr 22 15:08:37.483164 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.483001 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"e467d1e06939526f21672c12285795d9e69ea43977b75cfff2f8fbfd3870905b"} Apr 22 15:08:37.484442 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.484416 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" event={"ID":"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832","Type":"ContainerStarted","Data":"a2344b02dfabce2b7d1bcca270df6f247461a654adb9a547de6d4199edb93b8c"} Apr 22 15:08:37.485761 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.485713 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2b7k" event={"ID":"6e2a4fe1-ffb7-4e8e-8401-5dae97434c83","Type":"ContainerStarted","Data":"34e764a6a924e1ef51e610bff40152030f32d3ef3f69d261d6fbcf7cd01bce6f"} Apr 22 15:08:37.487262 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.487233 2606 generic.go:358] "Generic (PLEG): container finished" podID="807f61ad-b285-4b1d-b001-650d8ea8f622" containerID="ac956649f4d297030567962a1c134f92e685a7e5e005ae67a8a90d7eff78f52e" exitCode=0 Apr 22 15:08:37.487379 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.487280 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerDied","Data":"ac956649f4d297030567962a1c134f92e685a7e5e005ae67a8a90d7eff78f52e"} Apr 22 15:08:37.488640 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.488582 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2hlwz" event={"ID":"d3bf255c-11af-482e-b25f-be50be214aed","Type":"ContainerStarted","Data":"0a14af04967779d3010de6bb5980898e81cb8914bc12e7b84d78b7286ab2240d"} Apr 22 15:08:37.490397 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.490373 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zlnwc" event={"ID":"4cf417d6-b90e-4570-8262-b67044850c51","Type":"ContainerStarted","Data":"707c46b6be488dd592ba0c6c362262ab042b4a4b37c983e691dc878988f4ca57"} Apr 22 15:08:37.491755 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.491730 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9wsxv" event={"ID":"002b7b51-3bad-42f0-b2ba-e4180da5923c","Type":"ContainerStarted","Data":"b4c11f34543b06e3815976e63cdda68dede41b0626174ef981bc520681fe8c4c"} Apr 22 15:08:37.500453 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.500349 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-188.ec2.internal" podStartSLOduration=20.500336435 podStartE2EDuration="20.500336435s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:08:21.477229452 +0000 UTC m=+5.591178058" watchObservedRunningTime="2026-04-22 15:08:37.500336435 +0000 UTC m=+21.614285039" Apr 22 15:08:37.500659 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:08:37.500622 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-w9rvd" podStartSLOduration=4.340597155 podStartE2EDuration="21.500612374s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.149264997 +0000 UTC m=+3.263213619" lastFinishedPulling="2026-04-22 15:08:36.309280248 +0000 UTC m=+20.423228838" observedRunningTime="2026-04-22 15:08:37.499705481 +0000 UTC m=+21.613654106" watchObservedRunningTime="2026-04-22 15:08:37.500612374 +0000 UTC m=+21.614560977" Apr 22 15:08:37.516851 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.516806 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9wsxv" podStartSLOduration=4.345417848 podStartE2EDuration="21.516794341s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.136010734 +0000 UTC m=+3.249959318" lastFinishedPulling="2026-04-22 15:08:36.307387214 +0000 UTC m=+20.421335811" observedRunningTime="2026-04-22 15:08:37.515908501 +0000 UTC m=+21.629857120" watchObservedRunningTime="2026-04-22 15:08:37.516794341 +0000 UTC m=+21.630742942" Apr 22 15:08:37.535971 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.535727 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s2b7k" podStartSLOduration=4.391454734 podStartE2EDuration="21.535711305s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.138854336 +0000 UTC m=+3.252802933" lastFinishedPulling="2026-04-22 15:08:36.283110904 +0000 UTC m=+20.397059504" observedRunningTime="2026-04-22 15:08:37.535493919 +0000 UTC m=+21.649442522" watchObservedRunningTime="2026-04-22 15:08:37.535711305 +0000 UTC m=+21.649659910" Apr 22 15:08:37.568485 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.568454 2606 plugin_watcher.go:194] "Adding 
socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:08:37.582529 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.582489 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zlnwc" podStartSLOduration=4.132055181 podStartE2EDuration="21.582476549s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.146345666 +0000 UTC m=+3.260294247" lastFinishedPulling="2026-04-22 15:08:36.596767013 +0000 UTC m=+20.710715615" observedRunningTime="2026-04-22 15:08:37.581599729 +0000 UTC m=+21.695548334" watchObservedRunningTime="2026-04-22 15:08:37.582476549 +0000 UTC m=+21.696425153" Apr 22 15:08:37.597789 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:37.597752 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2hlwz" podStartSLOduration=9.187022639 podStartE2EDuration="21.597740312s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.147277856 +0000 UTC m=+3.261226451" lastFinishedPulling="2026-04-22 15:08:31.557995529 +0000 UTC m=+15.671944124" observedRunningTime="2026-04-22 15:08:37.597320851 +0000 UTC m=+21.711269465" watchObservedRunningTime="2026-04-22 15:08:37.597740312 +0000 UTC m=+21.711688943" Apr 22 15:08:38.364289 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.364169 2606 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:08:37.568479817Z","UUID":"11292cdf-bf80-417a-b784-d193bff9aa57","Handler":null,"Name":"","Endpoint":""} Apr 22 15:08:38.366820 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.366790 2606 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock 
versions: 1.0.0 Apr 22 15:08:38.366820 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.366823 2606 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:08:38.412654 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.412626 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:38.412835 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.412626 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:38.412835 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:38.412762 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281" Apr 22 15:08:38.412969 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:38.412835 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9" Apr 22 15:08:38.412969 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.412845 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:08:38.412969 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:38.412956 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0" Apr 22 15:08:38.496187 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.496144 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" event={"ID":"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832","Type":"ContainerStarted","Data":"fc9f5d713a21db932e9cb0a83f4c4165a6f307f5fed636cb684690060fc76967"} Apr 22 15:08:38.497717 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.497686 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f47hr" event={"ID":"4158dca8-c8a1-478c-92e2-6eff8a81fd54","Type":"ContainerStarted","Data":"55b3f7c294529b9591d12d397083ba4081e85461ded85d2ec58b1b68ccb2c56c"} Apr 22 15:08:38.516409 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:38.516348 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f47hr" podStartSLOduration=5.35221922 podStartE2EDuration="22.516335325s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.143307856 +0000 UTC m=+3.257256453" lastFinishedPulling="2026-04-22 15:08:36.307423971 +0000 UTC m=+20.421372558" observedRunningTime="2026-04-22 15:08:38.516233186 +0000 UTC m=+22.630181789" watchObservedRunningTime="2026-04-22 15:08:38.516335325 +0000 UTC m=+22.630283929" Apr 22 15:08:39.477835 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:39.477798 2606 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s2b7k" Apr 22 15:08:39.478504 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:39.478481 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s2b7k" Apr 22 15:08:39.503325 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:39.503293 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:08:39.503978 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:39.503715 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"de5c7f68d29fd4e41b30bfe397500bbecefbf2373bc576d680ef4164a88bd335"} Apr 22 15:08:39.505804 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:39.505771 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" event={"ID":"cdcf5e3d-d86e-4839-ab5e-b1b382d5c832","Type":"ContainerStarted","Data":"702d9db55e1ab2ad8e9b293145acceddac6bc6600ddbda14cfbec803a2b49ec1"} Apr 22 15:08:39.535820 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:39.535776 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6zbs6" podStartSLOduration=4.099298094 podStartE2EDuration="23.535762292s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.145642942 +0000 UTC m=+3.259591541" lastFinishedPulling="2026-04-22 15:08:38.582107156 +0000 UTC m=+22.696055739" observedRunningTime="2026-04-22 15:08:39.535236395 +0000 UTC m=+23.649185000" watchObservedRunningTime="2026-04-22 15:08:39.535762292 +0000 UTC m=+23.649710893" Apr 22 15:08:40.412202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:40.412166 
2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:40.412413 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:40.412166 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:40.412413 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:40.412279 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:40.412551 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:40.412167 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:40.412551 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:40.412416 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:40.412551 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:40.412527 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:40.507697 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:40.507670 2606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 15:08:42.412473 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.412256 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:42.412985 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.412256 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:42.412985 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:42.412589 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:42.412985 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.412256 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:42.412985 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:42.412619 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:42.412985 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:42.412694 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:42.514269 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.514243 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log"
Apr 22 15:08:42.514606 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.514583 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"f110af573a75196c38becfc57826c344734d7bdfa6ec80ea0190dbe624bb3a3e"}
Apr 22 15:08:42.514965 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.514936 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:42.515065 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.514978 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:42.515065 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.515055 2606 scope.go:117] "RemoveContainer" containerID="41b19a0488df99f49d375a4fd32b9f67f2b9e9d0adf650d725b9cbdbe0ba810e"
Apr 22 15:08:42.516469 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.516433 2606 generic.go:358] "Generic (PLEG): container finished" podID="807f61ad-b285-4b1d-b001-650d8ea8f622" containerID="5776f74488d813308c4a207c84dc023fe54beb2d153584015acd107ec244e277" exitCode=0
Apr 22 15:08:42.516529 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.516468 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerDied","Data":"5776f74488d813308c4a207c84dc023fe54beb2d153584015acd107ec244e277"}
Apr 22 15:08:42.530187 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:42.530163 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:43.521354 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:43.521328 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log"
Apr 22 15:08:43.521733 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:43.521652 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" event={"ID":"d9e225c1-9713-4720-8357-aaf7078a9c2d","Type":"ContainerStarted","Data":"7be763ea40b5aeee892558993d0d0fb5a08fc524950d6140b18e14d6f4df68da"}
Apr 22 15:08:43.521880 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:43.521854 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:43.543115 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:43.543090 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2"
Apr 22 15:08:43.563836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:43.563792 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" podStartSLOduration=10.360540238 podStartE2EDuration="27.563778194s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.1481084 +0000 UTC m=+3.262056984" lastFinishedPulling="2026-04-22 15:08:36.351346346 +0000 UTC m=+20.465294940" observedRunningTime="2026-04-22 15:08:43.561580509 +0000 UTC m=+27.675529142" watchObservedRunningTime="2026-04-22 15:08:43.563778194 +0000 UTC m=+27.677726799"
Apr 22 15:08:44.412153 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:44.412120 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:44.412472 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:44.412127 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:44.412472 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:44.412127 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:44.412472 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:44.412303 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:44.412472 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:44.412381 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:44.412472 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:44.412214 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:44.524871 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:44.524841 2606 generic.go:358] "Generic (PLEG): container finished" podID="807f61ad-b285-4b1d-b001-650d8ea8f622" containerID="76931ef5baa8ba1c06dff339c3666bec352ae247e7cac671a6bcf4a7015fd080" exitCode=0
Apr 22 15:08:44.525324 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:44.524931 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerDied","Data":"76931ef5baa8ba1c06dff339c3666bec352ae247e7cac671a6bcf4a7015fd080"}
Apr 22 15:08:45.528251 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:45.528219 2606 generic.go:358] "Generic (PLEG): container finished" podID="807f61ad-b285-4b1d-b001-650d8ea8f622" containerID="29b4fbace05d31aa56b1f9e6ee6339bcc2dd4d586ddca64c8c5e2167fcdf272f" exitCode=0
Apr 22 15:08:45.528622 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:45.528297 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerDied","Data":"29b4fbace05d31aa56b1f9e6ee6339bcc2dd4d586ddca64c8c5e2167fcdf272f"}
Apr 22 15:08:46.413593 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:46.413561 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:46.413785 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:46.413648 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:46.413785 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:46.413686 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:46.413785 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:46.413704 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:46.413959 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:46.413793 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:46.413959 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:46.413857 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:46.943369 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:46.943324 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s2b7k"
Apr 22 15:08:46.943787 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:46.943489 2606 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 15:08:46.944385 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:46.944350 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s2b7k"
Apr 22 15:08:48.411919 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:48.411877 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:48.411919 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:48.411908 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:48.412478 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:48.411884 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:48.412478 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:48.411998 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:48.412478 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:48.412120 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:48.412478 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:48.412203 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:50.041968 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:50.041034 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:50.041968 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.041178 2606 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:08:50.041968 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.041238 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs podName:7069128e-a7fb-43e9-a858-e8e3250b2ac0 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:22.041218729 +0000 UTC m=+66.155167324 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs") pod "network-metrics-daemon-75v74" (UID: "7069128e-a7fb-43e9-a858-e8e3250b2ac0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:08:50.142176 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:50.142142 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:50.142597 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.142302 2606 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 15:08:50.142597 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.142385 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret podName:7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:22.142347388 +0000 UTC m=+66.256295994 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret") pod "global-pull-secret-syncer-qrz9r" (UID: "7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281") : object "kube-system"/"original-pull-secret" not registered
Apr 22 15:08:50.243037 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:50.242999 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:50.243239 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.243138 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:08:50.243239 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.243153 2606 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:08:50.243239 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.243166 2606 projected.go:194] Error preparing data for projected volume kube-api-access-4wn8r for pod openshift-network-diagnostics/network-check-target-w6z28: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:08:50.243239 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.243236 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r podName:9331fcba-cdee-486e-b00b-7bb28c810ab9 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:22.243216607 +0000 UTC m=+66.357165202 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wn8r" (UniqueName: "kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r") pod "network-check-target-w6z28" (UID: "9331fcba-cdee-486e-b00b-7bb28c810ab9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:08:50.412618 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:50.412545 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:50.412775 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:50.412544 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:50.412775 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.412676 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:50.412775 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:50.412555 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:50.412934 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.412797 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:50.412934 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:50.412877 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:51.091833 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:51.091784 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qrz9r"]
Apr 22 15:08:51.092690 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:51.091936 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:51.092690 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:51.092050 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:51.097077 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:51.097048 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-75v74"]
Apr 22 15:08:51.097231 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:51.097198 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:51.097345 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:51.097311 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:51.097345 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:51.097333 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w6z28"]
Apr 22 15:08:51.097640 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:51.097460 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:51.097640 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:51.097551 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:52.412312 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:52.412118 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:52.412740 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:52.412169 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:52.412740 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:52.412403 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:52.412740 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:52.412500 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:52.544559 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:52.544529 2606 generic.go:358] "Generic (PLEG): container finished" podID="807f61ad-b285-4b1d-b001-650d8ea8f622" containerID="c1b730ae9730af728cdb25c28097d95cd1e904a551d072efff25b030ab3e9b29" exitCode=0
Apr 22 15:08:52.544713 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:52.544571 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerDied","Data":"c1b730ae9730af728cdb25c28097d95cd1e904a551d072efff25b030ab3e9b29"}
Apr 22 15:08:53.412219 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:53.412188 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:53.412417 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:53.412290 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-75v74" podUID="7069128e-a7fb-43e9-a858-e8e3250b2ac0"
Apr 22 15:08:53.549168 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:53.549133 2606 generic.go:358] "Generic (PLEG): container finished" podID="807f61ad-b285-4b1d-b001-650d8ea8f622" containerID="fe70b76f2dd912b909cfc0c7d6ea671fb61e251dac3ffa4b0d01a7a5fb31cde2" exitCode=0
Apr 22 15:08:53.549353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:53.549177 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerDied","Data":"fe70b76f2dd912b909cfc0c7d6ea671fb61e251dac3ffa4b0d01a7a5fb31cde2"}
Apr 22 15:08:54.412344 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:54.412315 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28"
Apr 22 15:08:54.412520 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:54.412444 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w6z28" podUID="9331fcba-cdee-486e-b00b-7bb28c810ab9"
Apr 22 15:08:54.412520 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:54.412487 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r"
Apr 22 15:08:54.412821 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:08:54.412544 2606 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qrz9r" podUID="7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281"
Apr 22 15:08:54.554379 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:54.554339 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" event={"ID":"807f61ad-b285-4b1d-b001-650d8ea8f622","Type":"ContainerStarted","Data":"2ff297803a620c454e579bc028e5a93ab56235c5f541428077285e4f4e70451e"}
Apr 22 15:08:54.581321 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:54.581264 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zcx5t" podStartSLOduration=6.18686823 podStartE2EDuration="38.581249839s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:08:19.138118802 +0000 UTC m=+3.252067385" lastFinishedPulling="2026-04-22 15:08:51.532500398 +0000 UTC m=+35.646448994" observedRunningTime="2026-04-22 15:08:54.581220704 +0000 UTC m=+38.695169308" watchObservedRunningTime="2026-04-22 15:08:54.581249839 +0000 UTC m=+38.695198437"
Apr 22 15:08:55.219463 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.219430 2606 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-188.ec2.internal" event="NodeReady"
Apr 22 15:08:55.219601 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.219546 2606 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 15:08:55.310847 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.310759 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lb9c7"]
Apr 22 15:08:55.346048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.346019 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xgdfl"]
Apr 22 15:08:55.346241 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.346186 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lb9c7"
Apr 22 15:08:55.356682 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.356660 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 15:08:55.356909 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.356893 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-62nkc\""
Apr 22 15:08:55.356970 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.356925 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 15:08:55.369614 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.369588 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ndg26"]
Apr 22 15:08:55.369790 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.369774 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xgdfl"
Apr 22 15:08:55.381917 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.381894 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 15:08:55.382349 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.382329 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 15:08:55.382491 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.382385 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 15:08:55.382702 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.382687 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dmx2q\""
Apr 22 15:08:55.388214 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.388198 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 15:08:55.391008 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.390991 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xgdfl"]
Apr 22 15:08:55.391049 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.391014 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lb9c7"]
Apr 22 15:08:55.391049 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.391022 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ndg26"]
Apr 22 15:08:55.391127 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.391113 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ndg26"
Apr 22 15:08:55.395998 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.395979 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xqxh9\""
Apr 22 15:08:55.396604 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.396585 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 15:08:55.396690 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.396589 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 15:08:55.398108 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.398090 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 15:08:55.411842 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.411821 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74"
Apr 22 15:08:55.414992 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.414966 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 15:08:55.415352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.415231 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pdgcv\""
Apr 22 15:08:55.481877 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.481846 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl"
Apr 22 15:08:55.481877 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.481886 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-config-volume\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7"
Apr 22 15:08:55.482088 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.481963 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdqhl\" (UniqueName: \"kubernetes.io/projected/b4b5d2c6-67fb-4e8b-b072-9a3d47f86162-kube-api-access-tdqhl\") pod \"ingress-canary-ndg26\" (UID: \"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162\") " pod="openshift-ingress-canary/ingress-canary-ndg26"
Apr 22 15:08:55.482088 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482004 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxkvx\" (UniqueName: \"kubernetes.io/projected/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-kube-api-access-gxkvx\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl"
Apr 22 15:08:55.482088 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482022 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4b5d2c6-67fb-4e8b-b072-9a3d47f86162-cert\") pod \"ingress-canary-ndg26\" (UID: \"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162\") " pod="openshift-ingress-canary/ingress-canary-ndg26"
Apr 22 15:08:55.482088 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482059 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-metrics-tls\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7"
Apr 22 15:08:55.482088 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482079 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-tmp-dir\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7"
Apr 22 15:08:55.482319 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482097 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dv9z\" (UniqueName: \"kubernetes.io/projected/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-kube-api-access-5dv9z\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7"
Apr 22 15:08:55.482319 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482115 2606
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.482319 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482225 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-crio-socket\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.482319 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.482281 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-data-volume\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583169 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583084 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxkvx\" (UniqueName: \"kubernetes.io/projected/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-kube-api-access-gxkvx\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583169 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583118 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4b5d2c6-67fb-4e8b-b072-9a3d47f86162-cert\") pod \"ingress-canary-ndg26\" (UID: 
\"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162\") " pod="openshift-ingress-canary/ingress-canary-ndg26" Apr 22 15:08:55.583169 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583146 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-metrics-tls\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.583169 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583163 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-tmp-dir\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583182 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dv9z\" (UniqueName: \"kubernetes.io/projected/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-kube-api-access-5dv9z\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583199 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583243 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-crio-socket\") pod 
\"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583283 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-data-volume\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583304 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583322 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-config-volume\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583422 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-crio-socket\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.583525 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583484 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdqhl\" (UniqueName: 
\"kubernetes.io/projected/b4b5d2c6-67fb-4e8b-b072-9a3d47f86162-kube-api-access-tdqhl\") pod \"ingress-canary-ndg26\" (UID: \"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162\") " pod="openshift-ingress-canary/ingress-canary-ndg26" Apr 22 15:08:55.583858 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.583738 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-data-volume\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.584053 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.584026 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.586946 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.586925 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.587097 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.587081 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4b5d2c6-67fb-4e8b-b072-9a3d47f86162-cert\") pod \"ingress-canary-ndg26\" (UID: \"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162\") " pod="openshift-ingress-canary/ingress-canary-ndg26" Apr 22 15:08:55.593162 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.593141 2606 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gxkvx\" (UniqueName: \"kubernetes.io/projected/8b72d438-c2b7-4709-a0a5-3c11f2a7894e-kube-api-access-gxkvx\") pod \"insights-runtime-extractor-xgdfl\" (UID: \"8b72d438-c2b7-4709-a0a5-3c11f2a7894e\") " pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.594266 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.594246 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdqhl\" (UniqueName: \"kubernetes.io/projected/b4b5d2c6-67fb-4e8b-b072-9a3d47f86162-kube-api-access-tdqhl\") pod \"ingress-canary-ndg26\" (UID: \"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162\") " pod="openshift-ingress-canary/ingress-canary-ndg26" Apr 22 15:08:55.598485 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.598459 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-tmp-dir\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.598763 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.598735 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-config-volume\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.600032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.600012 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-metrics-tls\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.600282 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.600264 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-5dv9z\" (UniqueName: \"kubernetes.io/projected/e0a9849c-6b92-4aa1-b14f-9246ef0c29f3-kube-api-access-5dv9z\") pod \"dns-default-lb9c7\" (UID: \"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3\") " pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.655379 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.655324 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:55.679155 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.679125 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xgdfl" Apr 22 15:08:55.698999 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.698969 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ndg26" Apr 22 15:08:55.846269 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.846194 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xgdfl"] Apr 22 15:08:55.850897 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:55.850869 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b72d438_c2b7_4709_a0a5_3c11f2a7894e.slice/crio-6a5ab9e5be4e68516807d0da0958a3a390d37d9f7d1b7563d92a12fa66e1bc94 WatchSource:0}: Error finding container 6a5ab9e5be4e68516807d0da0958a3a390d37d9f7d1b7563d92a12fa66e1bc94: Status 404 returned error can't find the container with id 6a5ab9e5be4e68516807d0da0958a3a390d37d9f7d1b7563d92a12fa66e1bc94 Apr 22 15:08:55.869950 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.869919 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lb9c7"] Apr 22 15:08:55.872551 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:55.872522 2606 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a9849c_6b92_4aa1_b14f_9246ef0c29f3.slice/crio-3aef686f6b689100a81b92a10c082a3e5fe4a633ec229633bed66278ed3caf4a WatchSource:0}: Error finding container 3aef686f6b689100a81b92a10c082a3e5fe4a633ec229633bed66278ed3caf4a: Status 404 returned error can't find the container with id 3aef686f6b689100a81b92a10c082a3e5fe4a633ec229633bed66278ed3caf4a Apr 22 15:08:55.873699 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:55.873677 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ndg26"] Apr 22 15:08:55.876595 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:55.876574 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b5d2c6_67fb_4e8b_b072_9a3d47f86162.slice/crio-5b608e0b21115ac2d240e64f3561d2e7bab2d1b7c90d9f2c7ccb6a37992894ae WatchSource:0}: Error finding container 5b608e0b21115ac2d240e64f3561d2e7bab2d1b7c90d9f2c7ccb6a37992894ae: Status 404 returned error can't find the container with id 5b608e0b21115ac2d240e64f3561d2e7bab2d1b7c90d9f2c7ccb6a37992894ae Apr 22 15:08:56.428710 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.428108 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:08:56.428710 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.428245 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:08:56.431198 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.430902 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:08:56.431198 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.430985 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:08:56.432039 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.432017 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqzj2\"" Apr 22 15:08:56.432141 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.432081 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:08:56.561415 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.561379 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgdfl" event={"ID":"8b72d438-c2b7-4709-a0a5-3c11f2a7894e","Type":"ContainerStarted","Data":"0e8bd4573da95b6d6102c933be12b11770800593bded1a39cf6b855aac6324a3"} Apr 22 15:08:56.561598 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.561424 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgdfl" event={"ID":"8b72d438-c2b7-4709-a0a5-3c11f2a7894e","Type":"ContainerStarted","Data":"6a5ab9e5be4e68516807d0da0958a3a390d37d9f7d1b7563d92a12fa66e1bc94"} Apr 22 15:08:56.563243 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.563216 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ndg26" event={"ID":"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162","Type":"ContainerStarted","Data":"5b608e0b21115ac2d240e64f3561d2e7bab2d1b7c90d9f2c7ccb6a37992894ae"} Apr 22 
15:08:56.564468 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:56.564408 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lb9c7" event={"ID":"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3","Type":"ContainerStarted","Data":"3aef686f6b689100a81b92a10c082a3e5fe4a633ec229633bed66278ed3caf4a"} Apr 22 15:08:57.466538 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.466497 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68766dc794-c4wp6"] Apr 22 15:08:57.490046 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.490006 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.493654 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.493574 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 15:08:57.494038 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.494018 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 15:08:57.494225 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.494195 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 15:08:57.494310 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.494254 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 15:08:57.494310 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.494262 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jtqk8\"" Apr 22 15:08:57.494445 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.494205 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 15:08:57.494445 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:08:57.494195 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 15:08:57.494789 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.494753 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 15:08:57.500481 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.500438 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68766dc794-c4wp6"] Apr 22 15:08:57.600209 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.600181 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-oauth-config\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.600351 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.600219 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-serving-cert\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.600351 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.600245 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-service-ca\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.600351 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.600285 2606 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74nnz\" (UniqueName: \"kubernetes.io/projected/a1e7115d-ddcd-45ec-90c8-5b49f1464536-kube-api-access-74nnz\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.600351 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.600341 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-config\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.600554 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.600387 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-oauth-serving-cert\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.701283 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.701228 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-oauth-serving-cert\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.701454 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.701385 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-oauth-config\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " 
pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.701454 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.701427 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-serving-cert\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.701588 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.701453 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-service-ca\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.701588 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.701480 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74nnz\" (UniqueName: \"kubernetes.io/projected/a1e7115d-ddcd-45ec-90c8-5b49f1464536-kube-api-access-74nnz\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.701588 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.701539 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-config\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.702493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.702467 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-config\") pod \"console-68766dc794-c4wp6\" 
(UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.702623 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.702471 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-service-ca\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.704351 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.704322 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-oauth-config\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.704504 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.704333 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-serving-cert\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.706083 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.706053 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-oauth-serving-cert\") pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.713464 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.713434 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74nnz\" (UniqueName: \"kubernetes.io/projected/a1e7115d-ddcd-45ec-90c8-5b49f1464536-kube-api-access-74nnz\") 
pod \"console-68766dc794-c4wp6\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:57.801521 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:57.801482 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:08:58.676227 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:58.676195 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68766dc794-c4wp6"] Apr 22 15:08:58.680897 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:08:58.680866 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1e7115d_ddcd_45ec_90c8_5b49f1464536.slice/crio-751b24fb463bf40ee8196eee4a7fbdedd8e0fdc78cbd51d758f7c3090fef42ba WatchSource:0}: Error finding container 751b24fb463bf40ee8196eee4a7fbdedd8e0fdc78cbd51d758f7c3090fef42ba: Status 404 returned error can't find the container with id 751b24fb463bf40ee8196eee4a7fbdedd8e0fdc78cbd51d758f7c3090fef42ba Apr 22 15:08:59.575778 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.575692 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgdfl" event={"ID":"8b72d438-c2b7-4709-a0a5-3c11f2a7894e","Type":"ContainerStarted","Data":"f85b1fc30b01feae6ec5423fa7c5aabf047c4a8eb1fc955aee473ffba607fea8"} Apr 22 15:08:59.576995 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.576968 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68766dc794-c4wp6" event={"ID":"a1e7115d-ddcd-45ec-90c8-5b49f1464536","Type":"ContainerStarted","Data":"751b24fb463bf40ee8196eee4a7fbdedd8e0fdc78cbd51d758f7c3090fef42ba"} Apr 22 15:08:59.578752 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.578726 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ndg26" 
event={"ID":"b4b5d2c6-67fb-4e8b-b072-9a3d47f86162","Type":"ContainerStarted","Data":"ab53685f3fe08083ce0a7b610dba799789c0f7e981a250d3fb6e1a4cb7aba9e6"} Apr 22 15:08:59.580814 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.580783 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lb9c7" event={"ID":"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3","Type":"ContainerStarted","Data":"25cfe21fed182540f316b6b8934ddfb2d6111004a67da4f5437412f80ccf50c9"} Apr 22 15:08:59.580814 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.580812 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lb9c7" event={"ID":"e0a9849c-6b92-4aa1-b14f-9246ef0c29f3","Type":"ContainerStarted","Data":"2f84185e59701cc79435d33f299d7d948ba473c7ac0d5c8de7abc85c294fb4c1"} Apr 22 15:08:59.580978 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.580918 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lb9c7" Apr 22 15:08:59.617819 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.617756 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ndg26" podStartSLOduration=1.945852581 podStartE2EDuration="4.61774124s" podCreationTimestamp="2026-04-22 15:08:55 +0000 UTC" firstStartedPulling="2026-04-22 15:08:55.878850476 +0000 UTC m=+39.992799059" lastFinishedPulling="2026-04-22 15:08:58.550739135 +0000 UTC m=+42.664687718" observedRunningTime="2026-04-22 15:08:59.616790651 +0000 UTC m=+43.730739269" watchObservedRunningTime="2026-04-22 15:08:59.61774124 +0000 UTC m=+43.731689844" Apr 22 15:08:59.656440 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:08:59.656343 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lb9c7" podStartSLOduration=1.985843763 podStartE2EDuration="4.65632002s" podCreationTimestamp="2026-04-22 15:08:55 +0000 UTC" firstStartedPulling="2026-04-22 
15:08:55.874526457 +0000 UTC m=+39.988475040" lastFinishedPulling="2026-04-22 15:08:58.545002709 +0000 UTC m=+42.658951297" observedRunningTime="2026-04-22 15:08:59.655384636 +0000 UTC m=+43.769333231" watchObservedRunningTime="2026-04-22 15:08:59.65632002 +0000 UTC m=+43.770268625" Apr 22 15:09:00.584436 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:00.584399 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgdfl" event={"ID":"8b72d438-c2b7-4709-a0a5-3c11f2a7894e","Type":"ContainerStarted","Data":"b94b21af74a5f6b19caf1732af123e3a38d33e400cf690e8b8af1bd1042a6986"} Apr 22 15:09:01.506085 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.506044 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc"] Apr 22 15:09:01.576161 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.576102 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f64bz"] Apr 22 15:09:01.576339 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.576311 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.582521 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.582479 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 15:09:01.582757 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.582735 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 15:09:01.583226 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.583062 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 15:09:01.583226 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.583158 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 15:09:01.583226 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.583166 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ss6qf\"" Apr 22 15:09:01.583929 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.583874 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 15:09:01.608719 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.608687 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc"] Apr 22 15:09:01.608719 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.608721 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f64bz"] Apr 22 15:09:01.609208 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.608737 2606 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-n4dfk"] Apr 22 15:09:01.609208 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.609021 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.615295 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.615272 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 15:09:01.615295 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.615289 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 15:09:01.623355 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.623332 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-v5rxd\"" Apr 22 15:09:01.623907 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.623884 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 15:09:01.633126 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.633105 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.640468 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.640446 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 15:09:01.640786 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.640720 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2fn2s\"" Apr 22 15:09:01.640965 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.640948 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 15:09:01.641048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.641022 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 15:09:01.661963 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.661907 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xgdfl" podStartSLOduration=2.182070927 podStartE2EDuration="6.661886986s" podCreationTimestamp="2026-04-22 15:08:55 +0000 UTC" firstStartedPulling="2026-04-22 15:08:55.980159874 +0000 UTC m=+40.094108460" lastFinishedPulling="2026-04-22 15:09:00.459975923 +0000 UTC m=+44.573924519" observedRunningTime="2026-04-22 15:09:01.660441762 +0000 UTC m=+45.774390367" watchObservedRunningTime="2026-04-22 15:09:01.661886986 +0000 UTC m=+45.775835591" Apr 22 15:09:01.732165 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732130 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: 
\"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.732165 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732170 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732205 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg9jr\" (UniqueName: \"kubernetes.io/projected/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-kube-api-access-fg9jr\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.732352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732225 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-tls\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732250 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.732352 ip-10-0-141-188 kubenswrapper[2606]: 
I0422 15:09:01.732277 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-accelerators-collector-config\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732299 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-root\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732550 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732394 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.732550 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732422 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.732550 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732442 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" 
(UniqueName: \"kubernetes.io/empty-dir/f131fc27-d3d8-4975-912a-262223f2a995-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.732550 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732459 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.732550 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732519 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jttv7\" (UniqueName: \"kubernetes.io/projected/f131fc27-d3d8-4975-912a-262223f2a995-kube-api-access-jttv7\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.732719 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732607 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.732719 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732626 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d3bf30d0-0409-4605-ad12-51f7fa8f533a-metrics-client-ca\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732719 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732671 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-wtmp\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732719 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732711 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-sys\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732876 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732741 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f131fc27-d3d8-4975-912a-262223f2a995-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.732876 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.732775 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-textfile\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.732876 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:09:01.732801 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n49rb\" (UniqueName: \"kubernetes.io/projected/d3bf30d0-0409-4605-ad12-51f7fa8f533a-kube-api-access-n49rb\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833373 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833284 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.833373 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833325 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bf30d0-0409-4605-ad12-51f7fa8f533a-metrics-client-ca\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833373 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833347 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-wtmp\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833587 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833394 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-sys\") pod \"node-exporter-n4dfk\" (UID: 
\"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833587 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833449 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-sys\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833587 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833550 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-wtmp\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833587 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833571 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f131fc27-d3d8-4975-912a-262223f2a995-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.833748 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833594 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-textfile\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833748 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833634 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n49rb\" (UniqueName: 
\"kubernetes.io/projected/d3bf30d0-0409-4605-ad12-51f7fa8f533a-kube-api-access-n49rb\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833748 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833716 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.833885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833759 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.833885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833798 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg9jr\" (UniqueName: \"kubernetes.io/projected/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-kube-api-access-fg9jr\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.833885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833821 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-tls\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 
15:09:01.833885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833846 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.834075 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.833894 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-textfile\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.834159 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834134 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-accelerators-collector-config\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.834218 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834199 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-root\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.834297 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834274 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f131fc27-d3d8-4975-912a-262223f2a995-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.834376 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834282 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.834439 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834348 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.834439 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834419 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f131fc27-d3d8-4975-912a-262223f2a995-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.834541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834452 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.834541 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834487 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jttv7\" (UniqueName: \"kubernetes.io/projected/f131fc27-d3d8-4975-912a-262223f2a995-kube-api-access-jttv7\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.834541 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:01.834355 2606 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 15:09:01.834683 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834533 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.834683 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834565 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3bf30d0-0409-4605-ad12-51f7fa8f533a-root\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.834683 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:01.834593 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls podName:f131fc27-d3d8-4975-912a-262223f2a995 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:02.334567291 +0000 UTC m=+46.448515879 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-f64bz" (UID: "f131fc27-d3d8-4975-912a-262223f2a995") : secret "kube-state-metrics-tls" not found Apr 22 15:09:01.834891 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.834849 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f131fc27-d3d8-4975-912a-262223f2a995-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.836488 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.836467 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-tls\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.836599 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.836527 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.836599 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.836560 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " 
pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.836954 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.836937 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.843183 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.843160 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.843354 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.843330 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bf30d0-0409-4605-ad12-51f7fa8f533a-metrics-client-ca\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.843470 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.843454 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.843552 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.843529 2606 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d3bf30d0-0409-4605-ad12-51f7fa8f533a-node-exporter-accelerators-collector-config\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.850247 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.850225 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n49rb\" (UniqueName: \"kubernetes.io/projected/d3bf30d0-0409-4605-ad12-51f7fa8f533a-kube-api-access-n49rb\") pod \"node-exporter-n4dfk\" (UID: \"d3bf30d0-0409-4605-ad12-51f7fa8f533a\") " pod="openshift-monitoring/node-exporter-n4dfk" Apr 22 15:09:01.855149 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.855102 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg9jr\" (UniqueName: \"kubernetes.io/projected/12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d-kube-api-access-fg9jr\") pod \"openshift-state-metrics-9d44df66c-8whlc\" (UID: \"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" Apr 22 15:09:01.856035 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.856009 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jttv7\" (UniqueName: \"kubernetes.io/projected/f131fc27-d3d8-4975-912a-262223f2a995-kube-api-access-jttv7\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" Apr 22 15:09:01.886691 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.886661 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc"
Apr 22 15:09:01.943487 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:01.943461 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n4dfk"
Apr 22 15:09:01.998835 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:01.998805 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bf30d0_0409_4605_ad12_51f7fa8f533a.slice/crio-47f29f32b4be35ae2ae14d6230921778c50ae0f8a52895f85de67531c382fc9f WatchSource:0}: Error finding container 47f29f32b4be35ae2ae14d6230921778c50ae0f8a52895f85de67531c382fc9f: Status 404 returned error can't find the container with id 47f29f32b4be35ae2ae14d6230921778c50ae0f8a52895f85de67531c382fc9f
Apr 22 15:09:02.015977 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.015942 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc"]
Apr 22 15:09:02.018900 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:02.018877 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12330f4f_a28c_4e98_afe7_f2ef0bd3cd1d.slice/crio-3e4bc5e10a6aa3b03343fdd8268762b8fe5790a0681c5a47403721745083c22d WatchSource:0}: Error finding container 3e4bc5e10a6aa3b03343fdd8268762b8fe5790a0681c5a47403721745083c22d: Status 404 returned error can't find the container with id 3e4bc5e10a6aa3b03343fdd8268762b8fe5790a0681c5a47403721745083c22d
Apr 22 15:09:02.340762 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.340671 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz"
Apr 22 15:09:02.340917 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:02.340831 2606 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 15:09:02.340917 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:02.340901 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls podName:f131fc27-d3d8-4975-912a-262223f2a995 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:03.340885722 +0000 UTC m=+47.454834304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-f64bz" (UID: "f131fc27-d3d8-4975-912a-262223f2a995") : secret "kube-state-metrics-tls" not found
Apr 22 15:09:02.592015 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.591922 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68766dc794-c4wp6" event={"ID":"a1e7115d-ddcd-45ec-90c8-5b49f1464536","Type":"ContainerStarted","Data":"5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46"}
Apr 22 15:09:02.594302 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.594264 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" event={"ID":"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d","Type":"ContainerStarted","Data":"c91f545f00b58fb13237cef916c7aa8d5feb98a17fcb6ebef4499d81aa29bb87"}
Apr 22 15:09:02.594302 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.594303 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" event={"ID":"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d","Type":"ContainerStarted","Data":"516bb1abd639803e09c66bdc73027c64db6832611de09ec592f8567b19f1a72c"}
Apr 22 15:09:02.594729 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.594318 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" event={"ID":"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d","Type":"ContainerStarted","Data":"3e4bc5e10a6aa3b03343fdd8268762b8fe5790a0681c5a47403721745083c22d"}
Apr 22 15:09:02.595916 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.595889 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4dfk" event={"ID":"d3bf30d0-0409-4605-ad12-51f7fa8f533a","Type":"ContainerStarted","Data":"47f29f32b4be35ae2ae14d6230921778c50ae0f8a52895f85de67531c382fc9f"}
Apr 22 15:09:02.613347 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.613323 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:02.632062 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.632029 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.635957 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.635934 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 15:09:02.636220 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636200 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 15:09:02.636303 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636200 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 15:09:02.636601 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636579 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 15:09:02.636601 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636596 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 15:09:02.636745 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636583 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 15:09:02.636745 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636642 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 15:09:02.636918 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636899 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 15:09:02.636986 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.636921 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-97fgb\""
Apr 22 15:09:02.637036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.637021 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 15:09:02.640693 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.639946 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68766dc794-c4wp6" podStartSLOduration=2.322619235 podStartE2EDuration="5.639929895s" podCreationTimestamp="2026-04-22 15:08:57 +0000 UTC" firstStartedPulling="2026-04-22 15:08:58.683225833 +0000 UTC m=+42.797174428" lastFinishedPulling="2026-04-22 15:09:02.000536491 +0000 UTC m=+46.114485088" observedRunningTime="2026-04-22 15:09:02.615750039 +0000 UTC m=+46.729698644" watchObservedRunningTime="2026-04-22 15:09:02.639929895 +0000 UTC m=+46.753878496"
Apr 22 15:09:02.641538 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.641495 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:02.744568 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744532 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-web-config\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.744794 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744600 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-out\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.744794 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744625 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.744794 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744702 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.744927 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744852 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.744927 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744880 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745019 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.744989 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745074 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.745019 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745074 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.745066 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745176 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.745113 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745176 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.745151 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxl9\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-kube-api-access-rbxl9\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745281 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.745204 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-volume\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.745281 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.745249 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.846669 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846470 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.846669 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846527 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.846669 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846577 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.846669 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846613 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.846669 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846642 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxl9\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-kube-api-access-rbxl9\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847020 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846686 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-volume\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847020 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846726 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847020 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846755 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-web-config\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847020 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846801 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-out\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847020 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:02.846686 2606 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 22 15:09:02.847020 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:02.846931 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls podName:9bd271b3-97e0-45e3-a870-19bfcbe723d4 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:03.34690777 +0000 UTC m=+47.460856367 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4") : secret "alertmanager-main-tls" not found
Apr 22 15:09:02.847317 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.846823 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847317 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.847083 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847317 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.847121 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.847317 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.847157 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.848910 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.847919 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.849162 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.849131 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.849527 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:02.849505 2606 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle podName:9bd271b3-97e0-45e3-a870-19bfcbe723d4 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:03.34948586 +0000 UTC m=+47.463434462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4") : configmap references non-existent config key: ca-bundle.crt
Apr 22 15:09:02.850417 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.850370 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.851127 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.850707 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.851505 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.851472 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-web-config\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.852428 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.852381 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.853304 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.853271 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-out\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.853952 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.853838 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.854622 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.854141 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.855677 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.855653 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-volume\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:02.862381 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:02.862344 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxl9\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-kube-api-access-rbxl9\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:03.351257 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.351217 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:03.351475 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.351294 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz"
Apr 22 15:09:03.351475 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.351331 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:03.352502 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.352472 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:03.353778 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.353749 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f131fc27-d3d8-4975-912a-262223f2a995-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f64bz\" (UID: \"f131fc27-d3d8-4975-912a-262223f2a995\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz"
Apr 22 15:09:03.353869 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.353860 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:03.418945 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.418914 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz"
Apr 22 15:09:03.543916 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.543883 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:03.792238 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.792208 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:03.796177 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:03.796137 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd271b3_97e0_45e3_a870_19bfcbe723d4.slice/crio-81a2c423b00d41024b3e7d218800da7982551c04cd050e58c529401ee7fd89e6 WatchSource:0}: Error finding container 81a2c423b00d41024b3e7d218800da7982551c04cd050e58c529401ee7fd89e6: Status 404 returned error can't find the container with id 81a2c423b00d41024b3e7d218800da7982551c04cd050e58c529401ee7fd89e6
Apr 22 15:09:03.817854 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:03.817825 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f64bz"]
Apr 22 15:09:03.822886 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:03.822842 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf131fc27_d3d8_4975_912a_262223f2a995.slice/crio-5f66a89897a04042f266fdd94b9825cb10205831ea10200e326f02ac54500a61 WatchSource:0}: Error finding container 5f66a89897a04042f266fdd94b9825cb10205831ea10200e326f02ac54500a61: Status 404 returned error can't find the container with id 5f66a89897a04042f266fdd94b9825cb10205831ea10200e326f02ac54500a61
Apr 22 15:09:04.604236 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:04.604192 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" event={"ID":"f131fc27-d3d8-4975-912a-262223f2a995","Type":"ContainerStarted","Data":"5f66a89897a04042f266fdd94b9825cb10205831ea10200e326f02ac54500a61"}
Apr 22 15:09:04.605181 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:04.605152 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"81a2c423b00d41024b3e7d218800da7982551c04cd050e58c529401ee7fd89e6"}
Apr 22 15:09:04.606567 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:04.606541 2606 generic.go:358] "Generic (PLEG): container finished" podID="d3bf30d0-0409-4605-ad12-51f7fa8f533a" containerID="a0f17cb38f59b9b2767f30e568bff4e7c64938699b0e183ddf53ba4723ae729e" exitCode=0
Apr 22 15:09:04.606646 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:04.606616 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4dfk" event={"ID":"d3bf30d0-0409-4605-ad12-51f7fa8f533a","Type":"ContainerDied","Data":"a0f17cb38f59b9b2767f30e568bff4e7c64938699b0e183ddf53ba4723ae729e"}
Apr 22 15:09:04.608598 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:04.608578 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" event={"ID":"12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d","Type":"ContainerStarted","Data":"b477e2fb00db0529b7c79ce88de86c1406b7180ed12c32ad9aa4f48880668416"}
Apr 22 15:09:04.665340 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:04.665280 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8whlc" podStartSLOduration=2.198831294 podStartE2EDuration="3.665261994s" podCreationTimestamp="2026-04-22 15:09:01 +0000 UTC" firstStartedPulling="2026-04-22 15:09:02.185387652 +0000 UTC m=+46.299336238" lastFinishedPulling="2026-04-22 15:09:03.651818341 +0000 UTC m=+47.765766938" observedRunningTime="2026-04-22 15:09:04.664279759 +0000 UTC m=+48.778228369" watchObservedRunningTime="2026-04-22 15:09:04.665261994 +0000 UTC m=+48.779210599"
Apr 22 15:09:05.612539 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:05.612504 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4dfk" event={"ID":"d3bf30d0-0409-4605-ad12-51f7fa8f533a","Type":"ContainerStarted","Data":"38c5f91e6c844d3a1c6ab38d4db11020d0f51a4f95816d45945c0573c1a4bb6b"}
Apr 22 15:09:05.612539 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:05.612542 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n4dfk" event={"ID":"d3bf30d0-0409-4605-ad12-51f7fa8f533a","Type":"ContainerStarted","Data":"f181d20875516a68549eff573098d438664a7a58d06b4ccbb5349688f3b54f35"}
Apr 22 15:09:06.283393 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.283324 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n4dfk" podStartSLOduration=3.6361732289999997 podStartE2EDuration="5.283306195s" podCreationTimestamp="2026-04-22 15:09:01 +0000 UTC" firstStartedPulling="2026-04-22 15:09:02.000475474 +0000 UTC m=+46.114424056" lastFinishedPulling="2026-04-22 15:09:03.64760842 +0000 UTC m=+47.761557022" observedRunningTime="2026-04-22 15:09:05.633700343 +0000 UTC m=+49.747649071" watchObservedRunningTime="2026-04-22 15:09:06.283306195 +0000 UTC m=+50.397254848"
Apr 22 15:09:06.283727 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.283707 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85d75445b4-8fs45"]
Apr 22 15:09:06.310115 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.310067 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d75445b4-8fs45"]
Apr 22 15:09:06.310385 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.310228 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.319945 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.319918 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 22 15:09:06.376099 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376067 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-oauth-config\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.376274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376121 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-oauth-serving-cert\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.376274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376183 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-trusted-ca-bundle\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.376274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376243 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-serving-cert\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.376430 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376273 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-console-config\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.376430 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376350 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-service-ca\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.376430 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.376405 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29qh\" (UniqueName: \"kubernetes.io/projected/0081f494-103e-486a-9b26-dd053bcd4b4b-kube-api-access-g29qh\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477562 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477531 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-oauth-config\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477730 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477578 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-oauth-serving-cert\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477769 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477724 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-trusted-ca-bundle\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477819 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477770 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-serving-cert\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477819 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477796 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-console-config\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477906 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477877 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-service-ca\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:06.477954 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.477906 2606 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-g29qh\" (UniqueName: \"kubernetes.io/projected/0081f494-103e-486a-9b26-dd053bcd4b4b-kube-api-access-g29qh\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.478378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.478332 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-oauth-serving-cert\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.478590 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.478566 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-service-ca\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.478668 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.478614 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-trusted-ca-bundle\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.478881 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.478861 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-console-config\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.480173 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.480152 2606 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-serving-cert\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.480275 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.480206 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-oauth-config\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.487511 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.487486 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29qh\" (UniqueName: \"kubernetes.io/projected/0081f494-103e-486a-9b26-dd053bcd4b4b-kube-api-access-g29qh\") pod \"console-85d75445b4-8fs45\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.617318 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.617228 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" event={"ID":"f131fc27-d3d8-4975-912a-262223f2a995","Type":"ContainerStarted","Data":"270312ab319347940991fcaf3b6f5e2a8ac4a82442d1c1bea3c7ca945f741fca"} Apr 22 15:09:06.617318 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.617266 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" event={"ID":"f131fc27-d3d8-4975-912a-262223f2a995","Type":"ContainerStarted","Data":"5119f80439752dda5206a3c14853b4ac8c92319bbe9a67db0eb9f65a01e38112"} Apr 22 15:09:06.617318 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.617277 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" event={"ID":"f131fc27-d3d8-4975-912a-262223f2a995","Type":"ContainerStarted","Data":"78f19eadea699c7f16bfa3f31ad89f20038800f59fd772e7154eabfae0fbdac8"} Apr 22 15:09:06.618691 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.618671 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade" exitCode=0 Apr 22 15:09:06.618781 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.618749 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"} Apr 22 15:09:06.619938 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.619921 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:06.651355 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.651302 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-f64bz" podStartSLOduration=3.664427716 podStartE2EDuration="5.651285559s" podCreationTimestamp="2026-04-22 15:09:01 +0000 UTC" firstStartedPulling="2026-04-22 15:09:03.824839306 +0000 UTC m=+47.938787888" lastFinishedPulling="2026-04-22 15:09:05.811697144 +0000 UTC m=+49.925645731" observedRunningTime="2026-04-22 15:09:06.649917462 +0000 UTC m=+50.763866069" watchObservedRunningTime="2026-04-22 15:09:06.651285559 +0000 UTC m=+50.765234162" Apr 22 15:09:06.757767 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.757738 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d75445b4-8fs45"] Apr 22 15:09:06.759689 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:06.759659 2606 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0081f494_103e_486a_9b26_dd053bcd4b4b.slice/crio-0252e0657fe339ba804a4e42605bf57ed049fc07363e51996a51c6f0ca196428 WatchSource:0}: Error finding container 0252e0657fe339ba804a4e42605bf57ed049fc07363e51996a51c6f0ca196428: Status 404 returned error can't find the container with id 0252e0657fe339ba804a4e42605bf57ed049fc07363e51996a51c6f0ca196428 Apr 22 15:09:06.870148 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.870093 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f"] Apr 22 15:09:06.876856 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.876835 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.883591 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.883567 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 15:09:06.883591 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.883585 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 15:09:06.883774 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.883597 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 15:09:06.883774 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.883685 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jw2f2\"" Apr 22 15:09:06.883893 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.883877 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 15:09:06.883970 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:09:06.883952 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 15:09:06.894242 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.894215 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 15:09:06.896824 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.896797 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f"] Apr 22 15:09:06.984628 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984581 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-metrics-client-ca\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.984791 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984641 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfkv6\" (UniqueName: \"kubernetes.io/projected/af64188f-395c-4e52-a211-cf0173131bd3-kube-api-access-jfkv6\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.984791 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984699 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-secret-telemeter-client\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 
15:09:06.984791 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984776 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-serving-certs-ca-bundle\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.984905 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984818 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.984905 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984839 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.984905 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984861 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-federate-client-tls\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:06.984905 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:06.984896 2606 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-telemeter-client-tls\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085511 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085477 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-serving-certs-ca-bundle\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085687 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085524 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085687 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085655 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085805 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085718 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-federate-client-tls\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085805 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085774 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-telemeter-client-tls\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085915 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085883 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-metrics-client-ca\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085968 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085916 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfkv6\" (UniqueName: \"kubernetes.io/projected/af64188f-395c-4e52-a211-cf0173131bd3-kube-api-access-jfkv6\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.085968 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.085949 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-secret-telemeter-client\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 
15:09:07.086301 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.086269 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-serving-certs-ca-bundle\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.086607 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.086579 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.087144 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.087114 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/af64188f-395c-4e52-a211-cf0173131bd3-metrics-client-ca\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.089008 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.088983 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-secret-telemeter-client\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.089117 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.089078 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.089183 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.089124 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-telemeter-client-tls\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.089183 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.089170 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/af64188f-395c-4e52-a211-cf0173131bd3-federate-client-tls\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.101170 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.101143 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfkv6\" (UniqueName: \"kubernetes.io/projected/af64188f-395c-4e52-a211-cf0173131bd3-kube-api-access-jfkv6\") pod \"telemeter-client-bc59b9f4d-7ch2f\" (UID: \"af64188f-395c-4e52-a211-cf0173131bd3\") " pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.185206 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.185118 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" Apr 22 15:09:07.337519 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.337484 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f"] Apr 22 15:09:07.341960 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:07.341930 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf64188f_395c_4e52_a211_cf0173131bd3.slice/crio-56504e2a1eeeafe998fcc896335310b4a0808dfd54c32f5e2d057e9b2e96ef02 WatchSource:0}: Error finding container 56504e2a1eeeafe998fcc896335310b4a0808dfd54c32f5e2d057e9b2e96ef02: Status 404 returned error can't find the container with id 56504e2a1eeeafe998fcc896335310b4a0808dfd54c32f5e2d057e9b2e96ef02 Apr 22 15:09:07.623836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.623789 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" event={"ID":"af64188f-395c-4e52-a211-cf0173131bd3","Type":"ContainerStarted","Data":"56504e2a1eeeafe998fcc896335310b4a0808dfd54c32f5e2d057e9b2e96ef02"} Apr 22 15:09:07.625522 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.625490 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d75445b4-8fs45" event={"ID":"0081f494-103e-486a-9b26-dd053bcd4b4b","Type":"ContainerStarted","Data":"42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e"} Apr 22 15:09:07.625655 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.625528 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d75445b4-8fs45" event={"ID":"0081f494-103e-486a-9b26-dd053bcd4b4b","Type":"ContainerStarted","Data":"0252e0657fe339ba804a4e42605bf57ed049fc07363e51996a51c6f0ca196428"} Apr 22 15:09:07.670401 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.670331 2606 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-console/console-85d75445b4-8fs45" podStartSLOduration=1.670315409 podStartE2EDuration="1.670315409s" podCreationTimestamp="2026-04-22 15:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:07.658714767 +0000 UTC m=+51.772663393" watchObservedRunningTime="2026-04-22 15:09:07.670315409 +0000 UTC m=+51.784264011" Apr 22 15:09:07.802162 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.802123 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:09:07.802344 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.802180 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:09:07.808352 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:07.808325 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:09:08.079156 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.079129 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:09:08.083435 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.083409 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.093521 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.093413 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 15:09:08.093650 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.093620 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a34g3geebnt8t\"" Apr 22 15:09:08.093749 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.093650 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 15:09:08.093749 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.093680 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 15:09:08.093944 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.093900 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 15:09:08.094146 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.094130 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 15:09:08.094233 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.094170 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 15:09:08.094233 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.094215 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-r5mdm\"" Apr 22 15:09:08.094337 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.094232 2606 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 15:09:08.095321 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.095303 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 15:09:08.095445 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.095427 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 15:09:08.096451 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096431 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096527 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096474 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-web-config\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096527 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096496 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096527 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096511 2606 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096536 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096553 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096588 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9vg\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-kube-api-access-jd9vg\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096608 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096624 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096661 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.096696 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096679 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.097224 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.096702 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.097331 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.097281 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config-out\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.097331 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.097324 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.097597 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.097393 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.097991 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.097425 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.097991 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.097728 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
15:09:08.097991 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.097793 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.100716 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.099774 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 15:09:08.100716 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.100114 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 15:09:08.100716 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.100486 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 15:09:08.108127 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.108107 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 15:09:08.110341 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.110324 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68766dc794-c4wp6"] Apr 22 15:09:08.124913 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.124861 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:09:08.198618 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198593 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.198730 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198639 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config-out\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.198788 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198743 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.198788 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198781 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.198887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198809 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.198887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198868 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.198974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.198924 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.199154 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199122 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199380 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-web-config\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199427 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199456 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199492 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199519 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199581 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9vg\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-kube-api-access-jd9vg\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199610 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199639 2606 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199699 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.199726 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.200049 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.200146 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.200830 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.201405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.201185 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.203075 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.203050 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.204140 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.203527 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.204140 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.203691 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.204140 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.203800 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.204852 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.204825 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.212333 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.212268 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.212637 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.212551 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-web-config\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.212637 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.212591 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.212835 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.212805 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.212910 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.212889 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.213178 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.213155 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.213242 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.213208 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.213441 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.213385 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.233233 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.233209 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9vg\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-kube-api-access-jd9vg\") pod \"prometheus-k8s-0\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.421707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.421624 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:08.565120 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.565085 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:09:08.569312 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:08.569273 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e3dc13_4457_4fde_a62b_95e44b7d12eb.slice/crio-cadc1bf2f0687623db7c339e5fcdb265633cbffe357175f63c2620a01fea3974 WatchSource:0}: Error finding container cadc1bf2f0687623db7c339e5fcdb265633cbffe357175f63c2620a01fea3974: Status 404 returned error can't find the container with id cadc1bf2f0687623db7c339e5fcdb265633cbffe357175f63c2620a01fea3974 Apr 22 15:09:08.630494 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.630438 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"cadc1bf2f0687623db7c339e5fcdb265633cbffe357175f63c2620a01fea3974"} Apr 22 15:09:08.633780 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.633671 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"} Apr 22 15:09:08.633780 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.633720 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"} Apr 22 15:09:08.633780 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.633736 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"} Apr 22 15:09:08.633780 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.633747 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"} Apr 22 15:09:08.633780 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.633760 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"} Apr 22 15:09:08.638654 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:08.638616 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:09:09.586611 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.586588 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lb9c7" Apr 22 15:09:09.640276 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.640236 2606 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerStarted","Data":"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"} Apr 22 15:09:09.642377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.642332 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" event={"ID":"af64188f-395c-4e52-a211-cf0173131bd3","Type":"ContainerStarted","Data":"d2e3f47baea33d20149abe316d0273d4491e1b90d9231d8a1fc1ea3e494f3c8f"} Apr 22 15:09:09.642500 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.642381 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" event={"ID":"af64188f-395c-4e52-a211-cf0173131bd3","Type":"ContainerStarted","Data":"709c2ec1c4195c4a4b3c81484f5faefb633dccc61edcd5f45240e7ddcc83401c"} Apr 22 15:09:09.644227 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.644040 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerID="e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3" exitCode=0 Apr 22 15:09:09.644227 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.644126 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3"} Apr 22 15:09:09.672173 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:09.672125 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.598504769 podStartE2EDuration="7.672104996s" podCreationTimestamp="2026-04-22 15:09:02 +0000 UTC" firstStartedPulling="2026-04-22 15:09:03.79856804 +0000 UTC m=+47.912516622" lastFinishedPulling="2026-04-22 15:09:08.872168264 +0000 UTC 
m=+52.986116849" observedRunningTime="2026-04-22 15:09:09.669707422 +0000 UTC m=+53.783656029" watchObservedRunningTime="2026-04-22 15:09:09.672104996 +0000 UTC m=+53.786053608" Apr 22 15:09:10.649084 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:10.649037 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" event={"ID":"af64188f-395c-4e52-a211-cf0173131bd3","Type":"ContainerStarted","Data":"9ee21ecaed6d0c059f3af798d88f9b9d7d8589d4bd28a63235b5725e81a667ad"} Apr 22 15:09:10.679195 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:10.679138 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-bc59b9f4d-7ch2f" podStartSLOduration=2.518129545 podStartE2EDuration="4.679125211s" podCreationTimestamp="2026-04-22 15:09:06 +0000 UTC" firstStartedPulling="2026-04-22 15:09:07.344538566 +0000 UTC m=+51.458487162" lastFinishedPulling="2026-04-22 15:09:09.505534245 +0000 UTC m=+53.619482828" observedRunningTime="2026-04-22 15:09:10.677300993 +0000 UTC m=+54.791249599" watchObservedRunningTime="2026-04-22 15:09:10.679125211 +0000 UTC m=+54.793073813" Apr 22 15:09:12.292367 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:12.292340 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85d75445b4-8fs45"] Apr 22 15:09:13.662310 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:13.662278 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4"} Apr 22 15:09:13.662310 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:13.662314 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8"} Apr 22 15:09:15.546166 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.546138 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ck8h2" Apr 22 15:09:15.675581 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.675504 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f"} Apr 22 15:09:15.675581 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.675540 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085"} Apr 22 15:09:15.675581 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.675551 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d"} Apr 22 15:09:15.675581 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.675561 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerStarted","Data":"69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619"} Apr 22 15:09:15.704838 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.704790 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.9286144219999999 podStartE2EDuration="7.704763795s" podCreationTimestamp="2026-04-22 
15:09:08 +0000 UTC" firstStartedPulling="2026-04-22 15:09:09.645769126 +0000 UTC m=+53.759717707" lastFinishedPulling="2026-04-22 15:09:15.421918495 +0000 UTC m=+59.535867080" observedRunningTime="2026-04-22 15:09:15.703140208 +0000 UTC m=+59.817088813" watchObservedRunningTime="2026-04-22 15:09:15.704763795 +0000 UTC m=+59.818712400" Apr 22 15:09:15.743438 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.743394 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:15.761921 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:15.761897 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:16.620354 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:16.620319 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:16.678275 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:16.678235 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:16.692843 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:16.692810 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:22.125966 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.125928 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:09:22.128996 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.128964 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:09:22.139015 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.138986 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7069128e-a7fb-43e9-a858-e8e3250b2ac0-metrics-certs\") pod \"network-metrics-daemon-75v74\" (UID: \"7069128e-a7fb-43e9-a858-e8e3250b2ac0\") " pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:09:22.227125 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.227090 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:09:22.229864 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.229844 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:09:22.240651 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.240620 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281-original-pull-secret\") pod \"global-pull-secret-syncer-qrz9r\" (UID: \"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281\") " pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:09:22.248823 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.248806 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qrz9r" Apr 22 15:09:22.327952 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.327917 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:09:22.330800 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.330777 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:09:22.341441 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.341414 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:09:22.352855 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.352820 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wn8r\" (UniqueName: \"kubernetes.io/projected/9331fcba-cdee-486e-b00b-7bb28c810ab9-kube-api-access-4wn8r\") pod \"network-check-target-w6z28\" (UID: \"9331fcba-cdee-486e-b00b-7bb28c810ab9\") " pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:09:22.372164 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.372138 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qrz9r"] Apr 22 15:09:22.375168 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:22.375133 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0f53d5_b964_4a2e_a21f_f5f4d2a4e281.slice/crio-8adbe02e01916d89e9cb58912668310677efc03d74760b8fc4b5beaf05d022d5 WatchSource:0}: Error finding container 
8adbe02e01916d89e9cb58912668310677efc03d74760b8fc4b5beaf05d022d5: Status 404 returned error can't find the container with id 8adbe02e01916d89e9cb58912668310677efc03d74760b8fc4b5beaf05d022d5 Apr 22 15:09:22.422115 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.422085 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pdgcv\"" Apr 22 15:09:22.430278 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.430263 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-75v74" Apr 22 15:09:22.545388 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.545348 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lqzj2\"" Apr 22 15:09:22.550202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.550175 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-75v74"] Apr 22 15:09:22.552841 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.552811 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:09:22.552943 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:22.552920 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7069128e_a7fb_43e9_a858_e8e3250b2ac0.slice/crio-30d6a42c144836c7370ed03fcaa3b713ad0fbf310bc71fe8a8e2c423a73ff2bb WatchSource:0}: Error finding container 30d6a42c144836c7370ed03fcaa3b713ad0fbf310bc71fe8a8e2c423a73ff2bb: Status 404 returned error can't find the container with id 30d6a42c144836c7370ed03fcaa3b713ad0fbf310bc71fe8a8e2c423a73ff2bb Apr 22 15:09:22.671433 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.671347 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w6z28"] Apr 22 15:09:22.674581 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:22.674559 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9331fcba_cdee_486e_b00b_7bb28c810ab9.slice/crio-bf4a9e47f66644f817267b32b8b676116ab130f43030a41d10a50e461bd805e9 WatchSource:0}: Error finding container bf4a9e47f66644f817267b32b8b676116ab130f43030a41d10a50e461bd805e9: Status 404 returned error can't find the container with id bf4a9e47f66644f817267b32b8b676116ab130f43030a41d10a50e461bd805e9 Apr 22 15:09:22.698414 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.698381 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w6z28" event={"ID":"9331fcba-cdee-486e-b00b-7bb28c810ab9","Type":"ContainerStarted","Data":"bf4a9e47f66644f817267b32b8b676116ab130f43030a41d10a50e461bd805e9"} Apr 22 15:09:22.699555 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.699515 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-75v74" 
event={"ID":"7069128e-a7fb-43e9-a858-e8e3250b2ac0","Type":"ContainerStarted","Data":"30d6a42c144836c7370ed03fcaa3b713ad0fbf310bc71fe8a8e2c423a73ff2bb"} Apr 22 15:09:22.700676 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:22.700652 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qrz9r" event={"ID":"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281","Type":"ContainerStarted","Data":"8adbe02e01916d89e9cb58912668310677efc03d74760b8fc4b5beaf05d022d5"} Apr 22 15:09:24.710658 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:24.710612 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-75v74" event={"ID":"7069128e-a7fb-43e9-a858-e8e3250b2ac0","Type":"ContainerStarted","Data":"ffe12b938781ecd532d6b2291e6893ac1cb8541f84130ce052d63f0a2d5d633b"} Apr 22 15:09:24.710658 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:24.710663 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-75v74" event={"ID":"7069128e-a7fb-43e9-a858-e8e3250b2ac0","Type":"ContainerStarted","Data":"f986971b620e406e88669ce35610fd9c852d979f9b4b52bef0df5da6fd2828c3"} Apr 22 15:09:24.730615 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:24.730524 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-75v74" podStartSLOduration=67.639403071 podStartE2EDuration="1m8.730503515s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:09:22.55708568 +0000 UTC m=+66.671034262" lastFinishedPulling="2026-04-22 15:09:23.648186103 +0000 UTC m=+67.762134706" observedRunningTime="2026-04-22 15:09:24.728815066 +0000 UTC m=+68.842763672" watchObservedRunningTime="2026-04-22 15:09:24.730503515 +0000 UTC m=+68.844452119" Apr 22 15:09:27.722353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:27.722320 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-w6z28" event={"ID":"9331fcba-cdee-486e-b00b-7bb28c810ab9","Type":"ContainerStarted","Data":"aada6de3afe515c4bcd0a0961dc5c80c15144a9909d764a08eff83e95eb6cc4c"} Apr 22 15:09:27.722710 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:27.722392 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:09:27.739720 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:27.739661 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-w6z28" podStartSLOduration=66.868426335 podStartE2EDuration="1m11.73964187s" podCreationTimestamp="2026-04-22 15:08:16 +0000 UTC" firstStartedPulling="2026-04-22 15:09:22.676383787 +0000 UTC m=+66.790332369" lastFinishedPulling="2026-04-22 15:09:27.547599306 +0000 UTC m=+71.661547904" observedRunningTime="2026-04-22 15:09:27.738132938 +0000 UTC m=+71.852081544" watchObservedRunningTime="2026-04-22 15:09:27.73964187 +0000 UTC m=+71.853590475" Apr 22 15:09:28.728076 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:28.728034 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qrz9r" event={"ID":"7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281","Type":"ContainerStarted","Data":"9d44aa1f12d69e77d4c156e2e15b08b8d7c70da42c25cd9b10109558f850e9eb"} Apr 22 15:09:34.666354 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:34.666311 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68766dc794-c4wp6" podUID="a1e7115d-ddcd-45ec-90c8-5b49f1464536" containerName="console" containerID="cri-o://5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46" gracePeriod=15 Apr 22 15:09:34.921076 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:34.921023 2606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-68766dc794-c4wp6_a1e7115d-ddcd-45ec-90c8-5b49f1464536/console/0.log" Apr 22 15:09:34.921174 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:34.921095 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:09:34.940314 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:34.940261 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qrz9r" podStartSLOduration=72.759663815 podStartE2EDuration="1m17.940243401s" podCreationTimestamp="2026-04-22 15:08:17 +0000 UTC" firstStartedPulling="2026-04-22 15:09:22.376924092 +0000 UTC m=+66.490872673" lastFinishedPulling="2026-04-22 15:09:27.557503666 +0000 UTC m=+71.671452259" observedRunningTime="2026-04-22 15:09:28.742826275 +0000 UTC m=+72.856774881" watchObservedRunningTime="2026-04-22 15:09:34.940243401 +0000 UTC m=+79.054192005" Apr 22 15:09:35.049195 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049163 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-serving-cert\") pod \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " Apr 22 15:09:35.049377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049228 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-service-ca\") pod \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " Apr 22 15:09:35.049377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049277 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74nnz\" (UniqueName: 
\"kubernetes.io/projected/a1e7115d-ddcd-45ec-90c8-5b49f1464536-kube-api-access-74nnz\") pod \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " Apr 22 15:09:35.049377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049294 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-oauth-serving-cert\") pod \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " Apr 22 15:09:35.049377 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049321 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-oauth-config\") pod \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " Apr 22 15:09:35.049605 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049380 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-config\") pod \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\" (UID: \"a1e7115d-ddcd-45ec-90c8-5b49f1464536\") " Apr 22 15:09:35.049662 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049617 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-service-ca" (OuterVolumeSpecName: "service-ca") pod "a1e7115d-ddcd-45ec-90c8-5b49f1464536" (UID: "a1e7115d-ddcd-45ec-90c8-5b49f1464536"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:35.049882 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.049850 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a1e7115d-ddcd-45ec-90c8-5b49f1464536" (UID: "a1e7115d-ddcd-45ec-90c8-5b49f1464536"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:35.050059 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.050033 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-config" (OuterVolumeSpecName: "console-config") pod "a1e7115d-ddcd-45ec-90c8-5b49f1464536" (UID: "a1e7115d-ddcd-45ec-90c8-5b49f1464536"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:35.051691 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.051663 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a1e7115d-ddcd-45ec-90c8-5b49f1464536" (UID: "a1e7115d-ddcd-45ec-90c8-5b49f1464536"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:35.051791 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.051690 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e7115d-ddcd-45ec-90c8-5b49f1464536-kube-api-access-74nnz" (OuterVolumeSpecName: "kube-api-access-74nnz") pod "a1e7115d-ddcd-45ec-90c8-5b49f1464536" (UID: "a1e7115d-ddcd-45ec-90c8-5b49f1464536"). InnerVolumeSpecName "kube-api-access-74nnz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:09:35.051833 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.051786 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a1e7115d-ddcd-45ec-90c8-5b49f1464536" (UID: "a1e7115d-ddcd-45ec-90c8-5b49f1464536"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:35.150398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.150265 2606 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-74nnz\" (UniqueName: \"kubernetes.io/projected/a1e7115d-ddcd-45ec-90c8-5b49f1464536-kube-api-access-74nnz\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:35.150398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.150298 2606 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-oauth-serving-cert\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:35.150398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.150310 2606 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-oauth-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:35.150398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.150321 2606 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:35.150398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.150330 2606 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a1e7115d-ddcd-45ec-90c8-5b49f1464536-console-serving-cert\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:35.150398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.150340 2606 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1e7115d-ddcd-45ec-90c8-5b49f1464536-service-ca\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:35.748884 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.748859 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68766dc794-c4wp6_a1e7115d-ddcd-45ec-90c8-5b49f1464536/console/0.log" Apr 22 15:09:35.749335 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.748899 2606 generic.go:358] "Generic (PLEG): container finished" podID="a1e7115d-ddcd-45ec-90c8-5b49f1464536" containerID="5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46" exitCode=2 Apr 22 15:09:35.749335 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.748938 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68766dc794-c4wp6" event={"ID":"a1e7115d-ddcd-45ec-90c8-5b49f1464536","Type":"ContainerDied","Data":"5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46"} Apr 22 15:09:35.749335 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.748971 2606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68766dc794-c4wp6" Apr 22 15:09:35.749335 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.748988 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68766dc794-c4wp6" event={"ID":"a1e7115d-ddcd-45ec-90c8-5b49f1464536","Type":"ContainerDied","Data":"751b24fb463bf40ee8196eee4a7fbdedd8e0fdc78cbd51d758f7c3090fef42ba"} Apr 22 15:09:35.749335 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.749013 2606 scope.go:117] "RemoveContainer" containerID="5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46" Apr 22 15:09:35.762189 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.762169 2606 scope.go:117] "RemoveContainer" containerID="5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46" Apr 22 15:09:35.762482 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:35.762460 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46\": container with ID starting with 5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46 not found: ID does not exist" containerID="5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46" Apr 22 15:09:35.762574 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.762495 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46"} err="failed to get container status \"5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46\": rpc error: code = NotFound desc = could not find container \"5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46\": container with ID starting with 5e59b302b855b308e4cbdaca732723d3e7f7e7323b974f8b5cfa923c933dca46 not found: ID does not exist" Apr 22 15:09:35.772202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.772176 2606 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68766dc794-c4wp6"] Apr 22 15:09:35.775259 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:35.775223 2606 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68766dc794-c4wp6"] Apr 22 15:09:36.416273 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:36.416241 2606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e7115d-ddcd-45ec-90c8-5b49f1464536" path="/var/lib/kubelet/pods/a1e7115d-ddcd-45ec-90c8-5b49f1464536/volumes" Apr 22 15:09:37.312299 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.312253 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85d75445b4-8fs45" podUID="0081f494-103e-486a-9b26-dd053bcd4b4b" containerName="console" containerID="cri-o://42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e" gracePeriod=15 Apr 22 15:09:37.553285 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.553265 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85d75445b4-8fs45_0081f494-103e-486a-9b26-dd053bcd4b4b/console/0.log" Apr 22 15:09:37.553420 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.553323 2606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85d75445b4-8fs45" Apr 22 15:09:37.672713 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672626 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g29qh\" (UniqueName: \"kubernetes.io/projected/0081f494-103e-486a-9b26-dd053bcd4b4b-kube-api-access-g29qh\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 15:09:37.672713 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672669 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-console-config\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 15:09:37.672713 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672702 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-serving-cert\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 15:09:37.672997 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672749 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-oauth-config\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 15:09:37.672997 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672765 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-oauth-serving-cert\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 
15:09:37.672997 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672893 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-trusted-ca-bundle\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 15:09:37.672997 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.672963 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-service-ca\") pod \"0081f494-103e-486a-9b26-dd053bcd4b4b\" (UID: \"0081f494-103e-486a-9b26-dd053bcd4b4b\") " Apr 22 15:09:37.673202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.673165 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:37.673202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.673177 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-console-config" (OuterVolumeSpecName: "console-config") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:37.673309 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.673250 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:37.673309 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.673268 2606 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-console-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:37.673309 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.673286 2606 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-oauth-serving-cert\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:37.673504 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.673490 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-service-ca" (OuterVolumeSpecName: "service-ca") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:37.675133 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.675112 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:37.675512 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.675475 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:37.675609 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.675509 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0081f494-103e-486a-9b26-dd053bcd4b4b-kube-api-access-g29qh" (OuterVolumeSpecName: "kube-api-access-g29qh") pod "0081f494-103e-486a-9b26-dd053bcd4b4b" (UID: "0081f494-103e-486a-9b26-dd053bcd4b4b"). InnerVolumeSpecName "kube-api-access-g29qh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:09:37.758047 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.758020 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85d75445b4-8fs45_0081f494-103e-486a-9b26-dd053bcd4b4b/console/0.log" Apr 22 15:09:37.758237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.758060 2606 generic.go:358] "Generic (PLEG): container finished" podID="0081f494-103e-486a-9b26-dd053bcd4b4b" containerID="42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e" exitCode=2 Apr 22 15:09:37.758237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.758142 2606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85d75445b4-8fs45"
Apr 22 15:09:37.758237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.758148 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d75445b4-8fs45" event={"ID":"0081f494-103e-486a-9b26-dd053bcd4b4b","Type":"ContainerDied","Data":"42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e"}
Apr 22 15:09:37.758237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.758181 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d75445b4-8fs45" event={"ID":"0081f494-103e-486a-9b26-dd053bcd4b4b","Type":"ContainerDied","Data":"0252e0657fe339ba804a4e42605bf57ed049fc07363e51996a51c6f0ca196428"}
Apr 22 15:09:37.758237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.758198 2606 scope.go:117] "RemoveContainer" containerID="42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e"
Apr 22 15:09:37.766935 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.766889 2606 scope.go:117] "RemoveContainer" containerID="42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e"
Apr 22 15:09:37.770833 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:37.770289 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e\": container with ID starting with 42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e not found: ID does not exist" containerID="42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e"
Apr 22 15:09:37.770833 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.770336 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e"} err="failed to get container status \"42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e\": rpc error: code = NotFound desc = could not find container \"42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e\": container with ID starting with 42e5815688aaf9c4d4c2058359e3ae54230bd0ccee4d264e639e7824f444832e not found: ID does not exist"
Apr 22 15:09:37.774592 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.774568 2606 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-trusted-ca-bundle\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:37.774592 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.774590 2606 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0081f494-103e-486a-9b26-dd053bcd4b4b-service-ca\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:37.774770 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.774599 2606 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g29qh\" (UniqueName: \"kubernetes.io/projected/0081f494-103e-486a-9b26-dd053bcd4b4b-kube-api-access-g29qh\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:37.774770 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.774609 2606 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-serving-cert\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:37.774770 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.774618 2606 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0081f494-103e-486a-9b26-dd053bcd4b4b-console-oauth-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:37.785048 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.785015 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85d75445b4-8fs45"]
Apr 22 15:09:37.787381 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:37.787341 2606 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85d75445b4-8fs45"]
Apr 22 15:09:38.416060 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:38.416022 2606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0081f494-103e-486a-9b26-dd053bcd4b4b" path="/var/lib/kubelet/pods/0081f494-103e-486a-9b26-dd053bcd4b4b/volumes"
Apr 22 15:09:51.444753 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.444716 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:51.445245 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.445198 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="alertmanager" containerID="cri-o://f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950" gracePeriod=120
Apr 22 15:09:51.445317 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.445259 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-metric" containerID="cri-o://c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57" gracePeriod=120
Apr 22 15:09:51.445394 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.445287 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-web" containerID="cri-o://6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a" gracePeriod=120
Apr 22 15:09:51.445394 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.445304 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy" containerID="cri-o://bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d" gracePeriod=120
Apr 22 15:09:51.445560 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.445328 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="config-reloader" containerID="cri-o://19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5" gracePeriod=120
Apr 22 15:09:51.445560 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.445400 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="prom-label-proxy" containerID="cri-o://6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f" gracePeriod=120
Apr 22 15:09:51.802079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802047 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f" exitCode=0
Apr 22 15:09:51.802079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802071 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d" exitCode=0
Apr 22 15:09:51.802079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802077 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5" exitCode=0
Apr 22 15:09:51.802079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802084 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950" exitCode=0
Apr 22 15:09:51.802376 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802122 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"}
Apr 22 15:09:51.802376 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802162 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"}
Apr 22 15:09:51.802376 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802178 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"}
Apr 22 15:09:51.802376 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:51.802191 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"}
Apr 22 15:09:52.693465 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.693440 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:52.791206 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791106 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-cluster-tls-config\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791206 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791151 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-web-config\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791206 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791191 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-metrics-client-ca\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791524 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791226 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791524 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791262 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-volume\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791524 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791313 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-out\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791762 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791736 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxl9\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-kube-api-access-rbxl9\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791839 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791799 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-web\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791839 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791801 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:09:52.791839 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791826 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-tls-assets\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791994 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791888 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-main-db\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791994 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791919 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791994 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791953 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.791994 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.791986 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls\") pod \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\" (UID: \"9bd271b3-97e0-45e3-a870-19bfcbe723d4\") "
Apr 22 15:09:52.792977 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.792385 2606 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-metrics-client-ca\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:52.794324 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.794101 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:09:52.794324 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.794108 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:09:52.796609 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.796535 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.796609 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.796558 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:09:52.796609 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.796570 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-out" (OuterVolumeSpecName: "config-out") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 15:09:52.796832 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.796627 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-volume" (OuterVolumeSpecName: "config-volume") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.796832 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.796750 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-kube-api-access-rbxl9" (OuterVolumeSpecName: "kube-api-access-rbxl9") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "kube-api-access-rbxl9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:09:52.797042 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.797015 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.797209 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.797180 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.797472 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.797454 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.799354 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.799331 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.804960 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.804935 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-web-config" (OuterVolumeSpecName: "web-config") pod "9bd271b3-97e0-45e3-a870-19bfcbe723d4" (UID: "9bd271b3-97e0-45e3-a870-19bfcbe723d4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:09:52.807544 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807521 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57" exitCode=0
Apr 22 15:09:52.807544 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807541 2606 generic.go:358] "Generic (PLEG): container finished" podID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerID="6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a" exitCode=0
Apr 22 15:09:52.807660 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807624 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"}
Apr 22 15:09:52.807698 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807661 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"}
Apr 22 15:09:52.807698 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807673 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9bd271b3-97e0-45e3-a870-19bfcbe723d4","Type":"ContainerDied","Data":"81a2c423b00d41024b3e7d218800da7982551c04cd050e58c529401ee7fd89e6"}
Apr 22 15:09:52.807698 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807690 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:09:52.807783 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.807688 2606 scope.go:117] "RemoveContainer" containerID="6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"
Apr 22 15:09:52.815330 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.815304 2606 scope.go:117] "RemoveContainer" containerID="c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"
Apr 22 15:09:52.821897 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.821880 2606 scope.go:117] "RemoveContainer" containerID="bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"
Apr 22 15:09:52.828097 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.828077 2606 scope.go:117] "RemoveContainer" containerID="6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"
Apr 22 15:09:52.831624 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.831591 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:52.835087 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.835069 2606 scope.go:117] "RemoveContainer" containerID="19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"
Apr 22 15:09:52.839425 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.839404 2606 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:52.842945 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.842927 2606 scope.go:117] "RemoveContainer" containerID="f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"
Apr 22 15:09:52.849607 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.849569 2606 scope.go:117] "RemoveContainer" containerID="3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"
Apr 22 15:09:52.856050 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.856035 2606 scope.go:117] "RemoveContainer" containerID="6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"
Apr 22 15:09:52.856332 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.856315 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f\": container with ID starting with 6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f not found: ID does not exist" containerID="6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"
Apr 22 15:09:52.856437 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.856341 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"} err="failed to get container status \"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f\": rpc error: code = NotFound desc = could not find container \"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f\": container with ID starting with 6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f not found: ID does not exist"
Apr 22 15:09:52.856437 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.856394 2606 scope.go:117] "RemoveContainer" containerID="c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"
Apr 22 15:09:52.856718 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.856673 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57\": container with ID starting with c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57 not found: ID does not exist" containerID="c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"
Apr 22 15:09:52.856800 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.856708 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"} err="failed to get container status \"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57\": rpc error: code = NotFound desc = could not find container \"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57\": container with ID starting with c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57 not found: ID does not exist"
Apr 22 15:09:52.856800 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.856733 2606 scope.go:117] "RemoveContainer" containerID="bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"
Apr 22 15:09:52.857146 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.857121 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d\": container with ID starting with bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d not found: ID does not exist" containerID="bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"
Apr 22 15:09:52.857234 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857153 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"} err="failed to get container status \"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d\": rpc error: code = NotFound desc = could not find container \"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d\": container with ID starting with bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d not found: ID does not exist"
Apr 22 15:09:52.857234 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857174 2606 scope.go:117] "RemoveContainer" containerID="6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"
Apr 22 15:09:52.857460 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.857440 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a\": container with ID starting with 6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a not found: ID does not exist" containerID="6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"
Apr 22 15:09:52.857532 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857469 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"} err="failed to get container status \"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a\": rpc error: code = NotFound desc = could not find container \"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a\": container with ID starting with 6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a not found: ID does not exist"
Apr 22 15:09:52.857532 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857491 2606 scope.go:117] "RemoveContainer" containerID="19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"
Apr 22 15:09:52.857726 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.857708 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5\": container with ID starting with 19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5 not found: ID does not exist" containerID="19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"
Apr 22 15:09:52.857766 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857741 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"} err="failed to get container status \"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5\": rpc error: code = NotFound desc = could not find container \"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5\": container with ID starting with 19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5 not found: ID does not exist"
Apr 22 15:09:52.857766 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857757 2606 scope.go:117] "RemoveContainer" containerID="f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"
Apr 22 15:09:52.857957 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.857943 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950\": container with ID starting with f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950 not found: ID does not exist" containerID="f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"
Apr 22 15:09:52.857996 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857962 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"} err="failed to get container status \"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950\": rpc error: code = NotFound desc = could not find container \"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950\": container with ID starting with f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950 not found: ID does not exist"
Apr 22 15:09:52.857996 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.857983 2606 scope.go:117] "RemoveContainer" containerID="3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"
Apr 22 15:09:52.858211 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:52.858195 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade\": container with ID starting with 3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade not found: ID does not exist" containerID="3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"
Apr 22 15:09:52.858246 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858217 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"} err="failed to get container status \"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade\": rpc error: code = NotFound desc = could not find container \"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade\": container with ID starting with 3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade not found: ID does not exist"
Apr 22 15:09:52.858246 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858231 2606 scope.go:117] "RemoveContainer" containerID="6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"
Apr 22 15:09:52.858490 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858463 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f"} err="failed to get container status \"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f\": rpc error: code = NotFound desc = could not find container \"6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f\": container with ID starting with 6f61a622663793e66d3f1170570f7b7d5abc09309817c15bbd0378637919568f not found: ID does not exist"
Apr 22 15:09:52.858543 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858498 2606 scope.go:117] "RemoveContainer" containerID="c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"
Apr 22 15:09:52.858712 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858690 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57"} err="failed to get container status \"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57\": rpc error: code = NotFound desc = could not find container \"c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57\": container with ID starting with c4eec8fd2c6b6d0a1887976a35231886a10f0b9e0b49153217df3f61e5aadd57 not found: ID does not exist"
Apr 22 15:09:52.858712 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858711 2606 scope.go:117] "RemoveContainer" containerID="bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"
Apr 22 15:09:52.858929 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858914 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d"} err="failed to get container status \"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d\": rpc error: code = NotFound desc = could not find container \"bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d\": container with ID starting with bd0493f614a1cd1108fc0bfc20ac4d27977b403fd161a26b36ba332d8670dd8d not found: ID does not exist"
Apr 22 15:09:52.858993 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.858929 2606 scope.go:117] "RemoveContainer" containerID="6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"
Apr 22 15:09:52.859150 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859133 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a"} err="failed to get container status \"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a\": rpc error: code = NotFound desc = could not find container \"6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a\": container with ID starting with 6a9f1e858bfa3fb881318954f486a5f6ebb0bc3a220ad68f56b40d448a0d2e5a not found: ID does not exist"
Apr 22 15:09:52.859192 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859150 2606 scope.go:117] "RemoveContainer" containerID="19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"
Apr 22 15:09:52.859398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859354 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5"} err="failed to get container status \"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5\": rpc error: code = NotFound desc = could not find container \"19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5\": container with ID starting with 19ffd579396ccbc414ad58b237f6fb4ffd9e5da371a66e55904fb0eb01d068b5 not found: ID does not exist"
Apr 22 15:09:52.859398 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859398 2606 scope.go:117] "RemoveContainer" containerID="f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"
Apr 22 15:09:52.859619 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859603 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950"} err="failed to get container status \"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950\": rpc error: code = NotFound desc = could not find container \"f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950\": container with ID starting with f288a80b9b42ed8cde3835b611e933aaf6364d8788db011ec44904426a47a950 not found: ID does not exist"
Apr 22 15:09:52.859687 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859619 2606 scope.go:117] "RemoveContainer" containerID="3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"
Apr 22 15:09:52.859820 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.859805 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade"} err="failed to get container status \"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade\": rpc error: code = NotFound desc = could not find container \"3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade\": container with ID starting with 3c2d14ea0265eeb622587a6bdc11e780c3a2028a5c814da75fa3e9decbeb1ade not found: ID does not exist"
Apr 22 15:09:52.869405 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869136 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869713 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy"
Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869735 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy"
Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869749 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-metric"
Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869758 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-metric"
Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869780 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="alertmanager" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869789 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="alertmanager" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869809 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0081f494-103e-486a-9b26-dd053bcd4b4b" containerName="console" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869817 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="0081f494-103e-486a-9b26-dd053bcd4b4b" containerName="console" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869831 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-web" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869839 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-web" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869857 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1e7115d-ddcd-45ec-90c8-5b49f1464536" containerName="console" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869865 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e7115d-ddcd-45ec-90c8-5b49f1464536" containerName="console" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869875 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" 
containerName="config-reloader" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869885 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="config-reloader" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869903 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="prom-label-proxy" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869911 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="prom-label-proxy" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869930 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="init-config-reloader" Apr 22 15:09:52.870032 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.869974 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="init-config-reloader" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870508 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="config-reloader" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870527 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-metric" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870538 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1e7115d-ddcd-45ec-90c8-5b49f1464536" containerName="console" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870544 2606 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="0081f494-103e-486a-9b26-dd053bcd4b4b" containerName="console" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870551 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="alertmanager" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870558 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy-web" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870565 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="kube-rbac-proxy" Apr 22 15:09:52.870663 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.870572 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" containerName="prom-label-proxy" Apr 22 15:09:52.875970 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.875953 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.878676 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.878652 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 15:09:52.878792 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.878665 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 15:09:52.878792 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.878699 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 15:09:52.878792 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.878716 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 15:09:52.878792 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.878716 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 15:09:52.878792 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.878661 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 15:09:52.879051 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.879027 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 15:09:52.879140 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.879125 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 15:09:52.879218 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.879205 2606 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-97fgb\"" Apr 22 15:09:52.883177 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.883149 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:09:52.884797 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.884778 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 15:09:52.892961 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.892936 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2gq\" (UniqueName: \"kubernetes.io/projected/16fb3581-9c8f-4729-ade0-66a65a61cb4b-kube-api-access-6j2gq\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.892978 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16fb3581-9c8f-4729-ade0-66a65a61cb4b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893031 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-config-volume\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893066 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/16fb3581-9c8f-4729-ade0-66a65a61cb4b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893135 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16fb3581-9c8f-4729-ade0-66a65a61cb4b-config-out\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893159 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893196 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893237 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893229 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-web-config\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893449 ip-10-0-141-188 kubenswrapper[2606]: 
I0422 15:09:52.893284 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16fb3581-9c8f-4729-ade0-66a65a61cb4b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893449 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893306 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16fb3581-9c8f-4729-ade0-66a65a61cb4b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893449 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893334 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893449 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893379 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893449 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893408 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893467 2606 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-volume\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893485 2606 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-config-out\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893500 2606 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbxl9\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-kube-api-access-rbxl9\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893516 2606 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893532 2606 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9bd271b3-97e0-45e3-a870-19bfcbe723d4-tls-assets\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893547 2606 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-main-db\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893562 2606 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bd271b3-97e0-45e3-a870-19bfcbe723d4-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893577 2606 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893592 2606 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-main-tls\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893607 2606 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-cluster-tls-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893621 2606 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-web-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.893680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.893635 2606 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/9bd271b3-97e0-45e3-a870-19bfcbe723d4-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:52.994644 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994607 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16fb3581-9c8f-4729-ade0-66a65a61cb4b-config-out\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.994644 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994645 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.994869 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994669 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.994869 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994784 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-web-config\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.994869 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994843 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/16fb3581-9c8f-4729-ade0-66a65a61cb4b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995023 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994868 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16fb3581-9c8f-4729-ade0-66a65a61cb4b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995023 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994897 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995023 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994919 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995023 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.994957 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995023 ip-10-0-141-188 kubenswrapper[2606]: 
I0422 15:09:52.995007 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2gq\" (UniqueName: \"kubernetes.io/projected/16fb3581-9c8f-4729-ade0-66a65a61cb4b-kube-api-access-6j2gq\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.995046 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16fb3581-9c8f-4729-ade0-66a65a61cb4b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.995087 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-config-volume\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.995114 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16fb3581-9c8f-4729-ade0-66a65a61cb4b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.995274 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.995245 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/16fb3581-9c8f-4729-ade0-66a65a61cb4b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
22 15:09:52.995860 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.995837 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16fb3581-9c8f-4729-ade0-66a65a61cb4b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.996172 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.996140 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16fb3581-9c8f-4729-ade0-66a65a61cb4b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.997925 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.997894 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-web-config\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998035 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.997929 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/16fb3581-9c8f-4729-ade0-66a65a61cb4b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998035 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.997938 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998188 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.998165 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/16fb3581-9c8f-4729-ade0-66a65a61cb4b-config-out\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998294 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.998274 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998379 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.998342 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998491 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.998476 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.998814 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.998793 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-config-volume\") pod \"alertmanager-main-0\" (UID: 
\"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:52.999655 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:52.999635 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/16fb3581-9c8f-4729-ade0-66a65a61cb4b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:53.004752 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:53.004725 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2gq\" (UniqueName: \"kubernetes.io/projected/16fb3581-9c8f-4729-ade0-66a65a61cb4b-kube-api-access-6j2gq\") pod \"alertmanager-main-0\" (UID: \"16fb3581-9c8f-4729-ade0-66a65a61cb4b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:53.186786 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:53.186699 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:09:53.315740 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:53.315711 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:09:53.316664 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:53.316635 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fb3581_9c8f_4729_ade0_66a65a61cb4b.slice/crio-e916afc966555957d7541ebf3587325ba3c0a779c23d9f8a00eb4f19b6a9df42 WatchSource:0}: Error finding container e916afc966555957d7541ebf3587325ba3c0a779c23d9f8a00eb4f19b6a9df42: Status 404 returned error can't find the container with id e916afc966555957d7541ebf3587325ba3c0a779c23d9f8a00eb4f19b6a9df42 Apr 22 15:09:53.812285 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:53.812192 2606 generic.go:358] "Generic (PLEG): container finished" podID="16fb3581-9c8f-4729-ade0-66a65a61cb4b" containerID="15d5d8a243d575c85ce87076a957634797033efd4652ba496fdb44937c0acd1d" exitCode=0 Apr 22 15:09:53.812740 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:53.812287 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerDied","Data":"15d5d8a243d575c85ce87076a957634797033efd4652ba496fdb44937c0acd1d"} Apr 22 15:09:53.812740 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:53.812327 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"e916afc966555957d7541ebf3587325ba3c0a779c23d9f8a00eb4f19b6a9df42"} Apr 22 15:09:54.416290 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.416212 2606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd271b3-97e0-45e3-a870-19bfcbe723d4" 
path="/var/lib/kubelet/pods/9bd271b3-97e0-45e3-a870-19bfcbe723d4/volumes" Apr 22 15:09:54.819353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.819312 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"e6f033448a2ee1c17b987f259c687f554198d34b53b24d39569d9014ebc5527a"} Apr 22 15:09:54.819353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.819350 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"23fe2c30198aaf56b0cb33c1e6704227a4411e287080aa64480a332f9ec7dda8"} Apr 22 15:09:54.819353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.819374 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"99c5818fb54b7db575e930de3535442de256eabccd8daa1384e5278f5b338dbb"} Apr 22 15:09:54.819943 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.819384 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"1bf7a90d3892383b87ee992e0a86de38573c07df375d8825ef42d425f5e88986"} Apr 22 15:09:54.819943 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.819392 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"35530df4a0e2dce844a05c0fba71457134fdbea224b2045278f4775539097131"} Apr 22 15:09:54.819943 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.819400 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"16fb3581-9c8f-4729-ade0-66a65a61cb4b","Type":"ContainerStarted","Data":"2724e46332af834498a324bde42f8bdc40263220208b16e230d598cbd4bde454"} Apr 22 15:09:54.847503 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:54.847442 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.84742224 podStartE2EDuration="2.84742224s" podCreationTimestamp="2026-04-22 15:09:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:54.844938708 +0000 UTC m=+98.958887311" watchObservedRunningTime="2026-04-22 15:09:54.84742224 +0000 UTC m=+98.961370845" Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.686583 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.687212 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="prometheus" containerID="cri-o://8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8" gracePeriod=600 Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.687576 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-thanos" containerID="cri-o://1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f" gracePeriod=600 Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.687664 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy" 
containerID="cri-o://4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085" gracePeriod=600 Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.687713 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-web" containerID="cri-o://a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d" gracePeriod=600 Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.687763 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="thanos-sidecar" containerID="cri-o://69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619" gracePeriod=600 Apr 22 15:09:55.688070 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.687811 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="config-reloader" containerID="cri-o://a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4" gracePeriod=600 Apr 22 15:09:55.825608 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825581 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerID="1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f" exitCode=0 Apr 22 15:09:55.825608 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825603 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerID="4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085" exitCode=0 Apr 22 15:09:55.825608 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825609 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" 
containerID="a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d" exitCode=0 Apr 22 15:09:55.825608 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825615 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerID="69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619" exitCode=0 Apr 22 15:09:55.825986 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825620 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerID="a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4" exitCode=0 Apr 22 15:09:55.825986 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825630 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f"} Apr 22 15:09:55.825986 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825665 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085"} Apr 22 15:09:55.825986 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825676 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d"} Apr 22 15:09:55.825986 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:55.825699 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619"} Apr 22 15:09:55.825986 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:09:55.825714 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4"} Apr 22 15:09:56.466400 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.466376 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:56.524434 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524392 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-grpc-tls\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524434 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524435 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-metrics-client-ca\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524566 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd9vg\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-kube-api-access-jd9vg\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524612 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: 
\"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524641 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-serving-certs-ca-bundle\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524661 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-rulefiles-0\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524683 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-kubelet-serving-ca-bundle\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524704 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-trusted-ca-bundle\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524723 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-tls-assets\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: 
\"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524748 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-metrics-client-certs\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524772 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524806 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-web-config\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524831 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config-out\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.524887 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524883 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: 
\"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.525244 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524908 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-db\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.525244 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524935 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-kube-rbac-proxy\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.525244 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524962 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-tls\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.525244 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.525014 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-thanos-prometheus-http-client-file\") pod \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\" (UID: \"87e3dc13-4457-4fde-a62b-95e44b7d12eb\") " Apr 22 15:09:56.525652 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.524851 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: 
"87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:56.525774 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.525114 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:56.525877 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.525618 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:56.526272 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.526165 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:09:56.527104 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.526821 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:56.527722 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.527693 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.527827 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.527750 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:09:56.527884 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.527853 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.529138 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.529108 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-kube-api-access-jd9vg" (OuterVolumeSpecName: "kube-api-access-jd9vg") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "kube-api-access-jd9vg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:09:56.529596 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.529540 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config" (OuterVolumeSpecName: "config") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.530059 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.529995 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.530059 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.530044 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.530202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.530123 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config-out" (OuterVolumeSpecName: "config-out") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:09:56.530202 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.530135 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.530469 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.530447 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:09:56.530568 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.530475 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.531213 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.531192 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.538300 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.538275 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-web-config" (OuterVolumeSpecName: "web-config") pod "87e3dc13-4457-4fde-a62b-95e44b7d12eb" (UID: "87e3dc13-4457-4fde-a62b-95e44b7d12eb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:09:56.625672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625581 2606 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jd9vg\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-kube-api-access-jd9vg\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625622 2606 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625634 2606 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625672 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:09:56.625646 2606 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625656 2606 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625666 2606 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625676 2606 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87e3dc13-4457-4fde-a62b-95e44b7d12eb-tls-assets\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625684 2606 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-metrics-client-certs\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625693 2606 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 
15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625703 2606 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-web-config\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625711 2606 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-config-out\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625719 2606 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625727 2606 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87e3dc13-4457-4fde-a62b-95e44b7d12eb-prometheus-k8s-db\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625735 2606 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-kube-rbac-proxy\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625744 2606 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:09:56.625753 2606 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625761 2606 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87e3dc13-4457-4fde-a62b-95e44b7d12eb-secret-grpc-tls\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:56.625974 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.625769 2606 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87e3dc13-4457-4fde-a62b-95e44b7d12eb-configmap-metrics-client-ca\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\""
Apr 22 15:09:56.832473 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.832432 2606 generic.go:358] "Generic (PLEG): container finished" podID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerID="8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8" exitCode=0
Apr 22 15:09:56.832837 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.832493 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8"}
Apr 22 15:09:56.832837 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.832541 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87e3dc13-4457-4fde-a62b-95e44b7d12eb","Type":"ContainerDied","Data":"cadc1bf2f0687623db7c339e5fcdb265633cbffe357175f63c2620a01fea3974"}
Apr 22 15:09:56.832837 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.832557 2606 scope.go:117] "RemoveContainer" containerID="1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f"
Apr 22 15:09:56.832837 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.832569 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:56.841098 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.841083 2606 scope.go:117] "RemoveContainer" containerID="4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085"
Apr 22 15:09:56.848292 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.848273 2606 scope.go:117] "RemoveContainer" containerID="a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d"
Apr 22 15:09:56.854992 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.854973 2606 scope.go:117] "RemoveContainer" containerID="69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619"
Apr 22 15:09:56.858928 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.858901 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 15:09:56.862443 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.862426 2606 scope.go:117] "RemoveContainer" containerID="a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4"
Apr 22 15:09:56.868845 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.868820 2606 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 15:09:56.869600 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.869583 2606 scope.go:117] "RemoveContainer" containerID="8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8"
Apr 22 15:09:56.876666 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.876644 2606 scope.go:117] "RemoveContainer" containerID="e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3"
Apr 22 15:09:56.882961 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.882943 2606 scope.go:117] "RemoveContainer" containerID="1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f"
Apr 22 15:09:56.883190 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.883174 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f\": container with ID starting with 1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f not found: ID does not exist" containerID="1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f"
Apr 22 15:09:56.883231 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883200 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f"} err="failed to get container status \"1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f\": rpc error: code = NotFound desc = could not find container \"1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f\": container with ID starting with 1bb1dea6862fa50fa113927037dfbe540861a4e0afb43d59a6b382b6ae00e87f not found: ID does not exist"
Apr 22 15:09:56.883231 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883218 2606 scope.go:117] "RemoveContainer" containerID="4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085"
Apr 22 15:09:56.883495 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.883476 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085\": container with ID starting with 4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085 not found: ID does not exist" containerID="4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085"
Apr 22 15:09:56.883545 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883503 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085"} err="failed to get container status \"4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085\": rpc error: code = NotFound desc = could not find container \"4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085\": container with ID starting with 4b7431faeb3f4eff7d3f1e13f1cdb2caa9032fa2c5ca002d9cebfc43fdefd085 not found: ID does not exist"
Apr 22 15:09:56.883545 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883521 2606 scope.go:117] "RemoveContainer" containerID="a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d"
Apr 22 15:09:56.883760 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.883743 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d\": container with ID starting with a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d not found: ID does not exist" containerID="a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d"
Apr 22 15:09:56.883810 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883762 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d"} err="failed to get container status \"a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d\": rpc error: code = NotFound desc = could not find container \"a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d\": container with ID starting with a9ceb3d41486f943c92cc87fea2e59f8e742716cd73dbe23db4d231e6d4ad96d not found: ID does not exist"
Apr 22 15:09:56.883810 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883775 2606 scope.go:117] "RemoveContainer" containerID="69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619"
Apr 22 15:09:56.883997 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.883983 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619\": container with ID starting with 69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619 not found: ID does not exist" containerID="69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619"
Apr 22 15:09:56.884036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.883998 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619"} err="failed to get container status \"69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619\": rpc error: code = NotFound desc = could not find container \"69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619\": container with ID starting with 69abd4b9ba00c894207848ddb57b98cf2e2f1313573e8fec17c765e01d58c619 not found: ID does not exist"
Apr 22 15:09:56.884036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.884012 2606 scope.go:117] "RemoveContainer" containerID="a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4"
Apr 22 15:09:56.884196 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.884182 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4\": container with ID starting with a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4 not found: ID does not exist" containerID="a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4"
Apr 22 15:09:56.884231 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.884199 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4"} err="failed to get container status \"a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4\": rpc error: code = NotFound desc = could not find container \"a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4\": container with ID starting with a673bbbfed2776176c260a60c47d853dc974ebcb9b51d6928c776974a3dfe7e4 not found: ID does not exist"
Apr 22 15:09:56.884231 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.884212 2606 scope.go:117] "RemoveContainer" containerID="8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8"
Apr 22 15:09:56.884479 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.884453 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8\": container with ID starting with 8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8 not found: ID does not exist" containerID="8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8"
Apr 22 15:09:56.884545 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.884484 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8"} err="failed to get container status \"8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8\": rpc error: code = NotFound desc = could not find container \"8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8\": container with ID starting with 8fb11de58dd7943e4bcc22b7b46674105014a17e1abeb2135cf83f3cff6fddf8 not found: ID does not exist"
Apr 22 15:09:56.884545 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.884499 2606 scope.go:117] "RemoveContainer" containerID="e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3"
Apr 22 15:09:56.884725 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:09:56.884709 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3\": container with ID starting with e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3 not found: ID does not exist" containerID="e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3"
Apr 22 15:09:56.884778 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.884729 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3"} err="failed to get container status \"e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3\": rpc error: code = NotFound desc = could not find container \"e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3\": container with ID starting with e2c641f1d5d346c3c1389fa0d897b876baeeb4c70d1ee0420d3082ffdec62cc3 not found: ID does not exist"
Apr 22 15:09:56.900143 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900116 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 15:09:56.900448 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900434 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="config-reloader"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900449 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="config-reloader"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900467 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900473 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900483 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="prometheus"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900490 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="prometheus"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900503 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="thanos-sidecar"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900508 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="thanos-sidecar"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900514 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-thanos"
Apr 22 15:09:56.900516 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900519 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-thanos"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900528 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="init-config-reloader"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900533 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="init-config-reloader"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900540 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-web"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900545 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-web"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900584 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="config-reloader"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900591 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-web"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900599 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900606 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="thanos-sidecar"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900611 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="prometheus"
Apr 22 15:09:56.900920 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.900617 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" containerName="kube-rbac-proxy-thanos"
Apr 22 15:09:56.918806 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.918781 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921516 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921545 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921584 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921593 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921523 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921731 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 15:09:56.921795 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921727 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 15:09:56.922145 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921870 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 15:09:56.922145 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921891 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 15:09:56.922145 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.921977 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 15:09:56.922145 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.922133 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 15:09:56.922639 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.922523 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-r5mdm\""
Apr 22 15:09:56.922639 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.922605 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-a34g3geebnt8t\""
Apr 22 15:09:56.922890 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.922871 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 15:09:56.925067 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.925050 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 15:09:56.928190 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:56.928147 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 15:09:57.028279 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028243 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028279 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028283 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028510 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028353 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028510 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028410 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028510 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028433 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7x9h\" (UniqueName: \"kubernetes.io/projected/95b61066-9752-47f7-8f27-f6ea6c8aa282-kube-api-access-c7x9h\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028510 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028482 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-config\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028510 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028511 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028729 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028531 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028729 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028547 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028729 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028638 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028729 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028676 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028729 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028711 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b61066-9752-47f7-8f27-f6ea6c8aa282-config-out\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028914 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028733 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-web-config\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028914 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028767 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b61066-9752-47f7-8f27-f6ea6c8aa282-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028914 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028795 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028914 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028836 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028914 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028864 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.028914 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.028885 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.129885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.129776 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.129885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.129845 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.129885 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.129874 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.129899 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.129922 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.129957 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130024 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130065 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7x9h\" (UniqueName: \"kubernetes.io/projected/95b61066-9752-47f7-8f27-f6ea6c8aa282-kube-api-access-c7x9h\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130101 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-config\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130171 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130129 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130569 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130515 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130569 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130555 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130674 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130605 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130674 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130641 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130772 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130683 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b61066-9752-47f7-8f27-f6ea6c8aa282-config-out\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130772 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130711 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-web-config\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130772 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130737 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b61066-9752-47f7-8f27-f6ea6c8aa282-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130772 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130764 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130961 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130814 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.130961 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130900 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.131102 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.130993 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133027 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.132999 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133027 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133018 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-config\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133235 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133007 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133346 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133323 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133743 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133718 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133744 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95b61066-9752-47f7-8f27-f6ea6c8aa282-config-out\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133754 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.133836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.133813 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.134380 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.134343 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 15:09:57.135500 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.135478 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName:
\"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-web-config\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.135620 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.135603 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95b61066-9752-47f7-8f27-f6ea6c8aa282-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.135941 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.135918 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/95b61066-9752-47f7-8f27-f6ea6c8aa282-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.136231 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.136212 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.136293 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.136213 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/95b61066-9752-47f7-8f27-f6ea6c8aa282-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.138548 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.138531 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7x9h\" (UniqueName: 
\"kubernetes.io/projected/95b61066-9752-47f7-8f27-f6ea6c8aa282-kube-api-access-c7x9h\") pod \"prometheus-k8s-0\" (UID: \"95b61066-9752-47f7-8f27-f6ea6c8aa282\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.231824 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.231780 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:09:57.358716 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.358643 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:09:57.361155 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:09:57.361127 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b61066_9752_47f7_8f27_f6ea6c8aa282.slice/crio-c3b78a14babae285ed4e6a784e6fc6875ee833a9eab305040c1a3846e11e890b WatchSource:0}: Error finding container c3b78a14babae285ed4e6a784e6fc6875ee833a9eab305040c1a3846e11e890b: Status 404 returned error can't find the container with id c3b78a14babae285ed4e6a784e6fc6875ee833a9eab305040c1a3846e11e890b Apr 22 15:09:57.841724 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.841691 2606 generic.go:358] "Generic (PLEG): container finished" podID="95b61066-9752-47f7-8f27-f6ea6c8aa282" containerID="a47530e2df487dbfee5aa5457cdb3a2aa29e23516492a69f72a8bc7fb0865d7e" exitCode=0 Apr 22 15:09:57.842149 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.841756 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerDied","Data":"a47530e2df487dbfee5aa5457cdb3a2aa29e23516492a69f72a8bc7fb0865d7e"} Apr 22 15:09:57.842149 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:57.841781 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"c3b78a14babae285ed4e6a784e6fc6875ee833a9eab305040c1a3846e11e890b"} Apr 22 15:09:58.416833 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.416797 2606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e3dc13-4457-4fde-a62b-95e44b7d12eb" path="/var/lib/kubelet/pods/87e3dc13-4457-4fde-a62b-95e44b7d12eb/volumes" Apr 22 15:09:58.730555 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.730481 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w6z28" Apr 22 15:09:58.847620 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.847584 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"e2dad41449f6f0ab522fd105dbb18aa739c36b83f9d183b4ac85cb67cd5c9f8f"} Apr 22 15:09:58.847620 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.847621 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"c90d072694724d1815b84b229f57eb7c75c5b376291c079e533102a227c8b29b"} Apr 22 15:09:58.848082 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.847634 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"7bcb119624d033c5f4892a5c48d589124919d90231f3e2d1d84c248844195572"} Apr 22 15:09:58.848082 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.847648 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"dc708735574f5c9a9a80f49427a10511817ce36c9b61c404cc2f0a39716ebe8e"} Apr 22 15:09:58.848082 
ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.847658 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"e7de600059b6a98f213fd7238e5682595f92ca14ddeaf359ce5adab9b5e303f3"} Apr 22 15:09:58.848082 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.847670 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"95b61066-9752-47f7-8f27-f6ea6c8aa282","Type":"ContainerStarted","Data":"bcfa94f9b84447b9b57c9f07eb9438b797693951e06ee748bac49b17fe71e0b6"} Apr 22 15:09:58.873585 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:09:58.873521 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.873500504 podStartE2EDuration="2.873500504s" podCreationTimestamp="2026-04-22 15:09:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:09:58.87171842 +0000 UTC m=+102.985667025" watchObservedRunningTime="2026-04-22 15:09:58.873500504 +0000 UTC m=+102.987449108" Apr 22 15:10:02.232443 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:02.232392 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:10:48.205931 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.205896 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b"] Apr 22 15:10:48.209065 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.209049 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.211735 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.211713 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 15:10:48.212893 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.212869 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 15:10:48.213017 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.212869 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 15:10:48.213017 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.212872 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 15:10:48.221632 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.221607 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b"] Apr 22 15:10:48.278305 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.278272 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3797b21d-b518-4a5a-8b2a-ff2bbba9932d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-bd646c5fc-7gz2b\" (UID: \"3797b21d-b518-4a5a-8b2a-ff2bbba9932d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.278493 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.278402 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8l8c\" (UniqueName: \"kubernetes.io/projected/3797b21d-b518-4a5a-8b2a-ff2bbba9932d-kube-api-access-p8l8c\") pod \"managed-serviceaccount-addon-agent-bd646c5fc-7gz2b\" (UID: \"3797b21d-b518-4a5a-8b2a-ff2bbba9932d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.379746 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.379708 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8l8c\" (UniqueName: \"kubernetes.io/projected/3797b21d-b518-4a5a-8b2a-ff2bbba9932d-kube-api-access-p8l8c\") pod \"managed-serviceaccount-addon-agent-bd646c5fc-7gz2b\" (UID: \"3797b21d-b518-4a5a-8b2a-ff2bbba9932d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.379746 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.379749 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3797b21d-b518-4a5a-8b2a-ff2bbba9932d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-bd646c5fc-7gz2b\" (UID: \"3797b21d-b518-4a5a-8b2a-ff2bbba9932d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.382220 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.382189 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3797b21d-b518-4a5a-8b2a-ff2bbba9932d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-bd646c5fc-7gz2b\" (UID: \"3797b21d-b518-4a5a-8b2a-ff2bbba9932d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.390197 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.390174 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8l8c\" (UniqueName: 
\"kubernetes.io/projected/3797b21d-b518-4a5a-8b2a-ff2bbba9932d-kube-api-access-p8l8c\") pod \"managed-serviceaccount-addon-agent-bd646c5fc-7gz2b\" (UID: \"3797b21d-b518-4a5a-8b2a-ff2bbba9932d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.528275 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.528245 2606 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" Apr 22 15:10:48.647882 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.647853 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b"] Apr 22 15:10:48.650778 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:10:48.650749 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3797b21d_b518_4a5a_8b2a_ff2bbba9932d.slice/crio-1f17094ec3957d4f28827c47bae4f6a57b52b0c2b69c9f39d34b29e0aa6d4612 WatchSource:0}: Error finding container 1f17094ec3957d4f28827c47bae4f6a57b52b0c2b69c9f39d34b29e0aa6d4612: Status 404 returned error can't find the container with id 1f17094ec3957d4f28827c47bae4f6a57b52b0c2b69c9f39d34b29e0aa6d4612 Apr 22 15:10:48.999100 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:48.999011 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" event={"ID":"3797b21d-b518-4a5a-8b2a-ff2bbba9932d","Type":"ContainerStarted","Data":"1f17094ec3957d4f28827c47bae4f6a57b52b0c2b69c9f39d34b29e0aa6d4612"} Apr 22 15:10:53.011825 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:53.011787 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" 
event={"ID":"3797b21d-b518-4a5a-8b2a-ff2bbba9932d","Type":"ContainerStarted","Data":"3812eccce3398c590a739a9e1c29785cfa1de4db7ba00bd83592ece3339b2441"} Apr 22 15:10:53.029335 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:53.029282 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bd646c5fc-7gz2b" podStartSLOduration=1.13863233 podStartE2EDuration="5.029267804s" podCreationTimestamp="2026-04-22 15:10:48 +0000 UTC" firstStartedPulling="2026-04-22 15:10:48.652416719 +0000 UTC m=+152.766365301" lastFinishedPulling="2026-04-22 15:10:52.54305219 +0000 UTC m=+156.657000775" observedRunningTime="2026-04-22 15:10:53.028120993 +0000 UTC m=+157.142069596" watchObservedRunningTime="2026-04-22 15:10:53.029267804 +0000 UTC m=+157.143216407" Apr 22 15:10:57.232456 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:57.232411 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:10:57.247828 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:57.247803 2606 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:10:58.042950 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:10:58.042925 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:11:45.244234 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.244192 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-qlpjf"] Apr 22 15:11:45.247621 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.247596 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.250168 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.250148 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 15:11:45.250286 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.250185 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-mx8rl\"" Apr 22 15:11:45.250286 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.250263 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 15:11:45.254997 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.254711 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-qlpjf"] Apr 22 15:11:45.349354 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.349314 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfb6m\" (UniqueName: \"kubernetes.io/projected/19317392-89a4-4c43-ae94-418ac7f92c0f-kube-api-access-gfb6m\") pod \"cert-manager-webhook-587ccfb98-qlpjf\" (UID: \"19317392-89a4-4c43-ae94-418ac7f92c0f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.349542 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.349395 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19317392-89a4-4c43-ae94-418ac7f92c0f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-qlpjf\" (UID: \"19317392-89a4-4c43-ae94-418ac7f92c0f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.450264 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.450228 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/19317392-89a4-4c43-ae94-418ac7f92c0f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-qlpjf\" (UID: \"19317392-89a4-4c43-ae94-418ac7f92c0f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.450534 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.450299 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfb6m\" (UniqueName: \"kubernetes.io/projected/19317392-89a4-4c43-ae94-418ac7f92c0f-kube-api-access-gfb6m\") pod \"cert-manager-webhook-587ccfb98-qlpjf\" (UID: \"19317392-89a4-4c43-ae94-418ac7f92c0f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.459392 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.459348 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19317392-89a4-4c43-ae94-418ac7f92c0f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-qlpjf\" (UID: \"19317392-89a4-4c43-ae94-418ac7f92c0f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.459562 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.459546 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfb6m\" (UniqueName: \"kubernetes.io/projected/19317392-89a4-4c43-ae94-418ac7f92c0f-kube-api-access-gfb6m\") pod \"cert-manager-webhook-587ccfb98-qlpjf\" (UID: \"19317392-89a4-4c43-ae94-418ac7f92c0f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.557701 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.557672 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:45.678672 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:45.678647 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-qlpjf"] Apr 22 15:11:45.681442 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:11:45.681413 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19317392_89a4_4c43_ae94_418ac7f92c0f.slice/crio-aec7dff4c697af65d72ebc359e53613a5a8176384783e76795113ec3481edff4 WatchSource:0}: Error finding container aec7dff4c697af65d72ebc359e53613a5a8176384783e76795113ec3481edff4: Status 404 returned error can't find the container with id aec7dff4c697af65d72ebc359e53613a5a8176384783e76795113ec3481edff4 Apr 22 15:11:46.169027 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:46.168985 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" event={"ID":"19317392-89a4-4c43-ae94-418ac7f92c0f","Type":"ContainerStarted","Data":"aec7dff4c697af65d72ebc359e53613a5a8176384783e76795113ec3481edff4"} Apr 22 15:11:48.537422 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.537395 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-sp85g"] Apr 22 15:11:48.540549 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.540531 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.542876 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.542854 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-7qfg5\"" Apr 22 15:11:48.550902 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.550878 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-sp85g"] Apr 22 15:11:48.585561 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.585523 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vgb\" (UniqueName: \"kubernetes.io/projected/508c1961-fcfe-4d3d-b776-37943b25c9d1-kube-api-access-c8vgb\") pod \"cert-manager-79c8d999ff-sp85g\" (UID: \"508c1961-fcfe-4d3d-b776-37943b25c9d1\") " pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.585709 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.585692 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/508c1961-fcfe-4d3d-b776-37943b25c9d1-bound-sa-token\") pod \"cert-manager-79c8d999ff-sp85g\" (UID: \"508c1961-fcfe-4d3d-b776-37943b25c9d1\") " pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.686942 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.686903 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8vgb\" (UniqueName: \"kubernetes.io/projected/508c1961-fcfe-4d3d-b776-37943b25c9d1-kube-api-access-c8vgb\") pod \"cert-manager-79c8d999ff-sp85g\" (UID: \"508c1961-fcfe-4d3d-b776-37943b25c9d1\") " pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.687121 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.687013 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/508c1961-fcfe-4d3d-b776-37943b25c9d1-bound-sa-token\") pod \"cert-manager-79c8d999ff-sp85g\" (UID: \"508c1961-fcfe-4d3d-b776-37943b25c9d1\") " pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.694430 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.694404 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/508c1961-fcfe-4d3d-b776-37943b25c9d1-bound-sa-token\") pod \"cert-manager-79c8d999ff-sp85g\" (UID: \"508c1961-fcfe-4d3d-b776-37943b25c9d1\") " pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.694531 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.694511 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vgb\" (UniqueName: \"kubernetes.io/projected/508c1961-fcfe-4d3d-b776-37943b25c9d1-kube-api-access-c8vgb\") pod \"cert-manager-79c8d999ff-sp85g\" (UID: \"508c1961-fcfe-4d3d-b776-37943b25c9d1\") " pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.851276 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.851240 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-sp85g" Apr 22 15:11:48.969539 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:48.969510 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-sp85g"] Apr 22 15:11:48.972609 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:11:48.972584 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod508c1961_fcfe_4d3d_b776_37943b25c9d1.slice/crio-5bda71b2a7983c0163dfb99dab7e0802a3f96116b6b5b1869f428747904a1872 WatchSource:0}: Error finding container 5bda71b2a7983c0163dfb99dab7e0802a3f96116b6b5b1869f428747904a1872: Status 404 returned error can't find the container with id 5bda71b2a7983c0163dfb99dab7e0802a3f96116b6b5b1869f428747904a1872 Apr 22 15:11:49.185984 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:49.185899 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" event={"ID":"19317392-89a4-4c43-ae94-418ac7f92c0f","Type":"ContainerStarted","Data":"22f8f935626d3f7c9ee5395845b43e7c861095399caf2a2ff72ed454183d353b"} Apr 22 15:11:49.185984 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:49.185967 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:11:49.187222 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:49.187194 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-sp85g" event={"ID":"508c1961-fcfe-4d3d-b776-37943b25c9d1","Type":"ContainerStarted","Data":"34b7d470c5e61f2bd30c8d0b403f17546a8764528ceb9b14589c5d206410e10e"} Apr 22 15:11:49.187302 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:49.187223 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-sp85g" 
event={"ID":"508c1961-fcfe-4d3d-b776-37943b25c9d1","Type":"ContainerStarted","Data":"5bda71b2a7983c0163dfb99dab7e0802a3f96116b6b5b1869f428747904a1872"} Apr 22 15:11:49.200827 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:49.200779 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" podStartSLOduration=1.413119201 podStartE2EDuration="4.200764311s" podCreationTimestamp="2026-04-22 15:11:45 +0000 UTC" firstStartedPulling="2026-04-22 15:11:45.683175878 +0000 UTC m=+209.797124460" lastFinishedPulling="2026-04-22 15:11:48.470820988 +0000 UTC m=+212.584769570" observedRunningTime="2026-04-22 15:11:49.199904275 +0000 UTC m=+213.313852882" watchObservedRunningTime="2026-04-22 15:11:49.200764311 +0000 UTC m=+213.314712915" Apr 22 15:11:49.214095 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:49.214033 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-sp85g" podStartSLOduration=1.214016657 podStartE2EDuration="1.214016657s" podCreationTimestamp="2026-04-22 15:11:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:11:49.213868479 +0000 UTC m=+213.327817083" watchObservedRunningTime="2026-04-22 15:11:49.214016657 +0000 UTC m=+213.327965262" Apr 22 15:11:55.191803 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:11:55.191769 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-qlpjf" Apr 22 15:12:37.272533 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.272448 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67"] Apr 22 15:12:37.275657 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.275640 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.278468 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.278442 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 22 15:12:37.278594 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.278471 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 22 15:12:37.278594 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.278448 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 15:12:37.279566 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.279542 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-gdpzp\"" Apr 22 15:12:37.279667 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.279552 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 15:12:37.289786 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.289766 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67"] Apr 22 15:12:37.381730 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.381694 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a1934745-764b-47c3-9de5-c57c40ccd36e-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.381907 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.381769 2606 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1934745-764b-47c3-9de5-c57c40ccd36e-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.381907 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.381812 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqbsm\" (UniqueName: \"kubernetes.io/projected/a1934745-764b-47c3-9de5-c57c40ccd36e-kube-api-access-jqbsm\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.483092 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.483058 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1934745-764b-47c3-9de5-c57c40ccd36e-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.483272 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.483104 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqbsm\" (UniqueName: \"kubernetes.io/projected/a1934745-764b-47c3-9de5-c57c40ccd36e-kube-api-access-jqbsm\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.483272 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.483169 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: 
\"kubernetes.io/configmap/a1934745-764b-47c3-9de5-c57c40ccd36e-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.483840 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.483817 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/a1934745-764b-47c3-9de5-c57c40ccd36e-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.485536 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.485516 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1934745-764b-47c3-9de5-c57c40ccd36e-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.491619 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.491589 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqbsm\" (UniqueName: \"kubernetes.io/projected/a1934745-764b-47c3-9de5-c57c40ccd36e-kube-api-access-jqbsm\") pod \"kubeflow-trainer-controller-manager-55f5694779-bnc67\" (UID: \"a1934745-764b-47c3-9de5-c57c40ccd36e\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.585022 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.584940 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:37.705214 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:37.705190 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67"] Apr 22 15:12:37.707840 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:12:37.707812 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1934745_764b_47c3_9de5_c57c40ccd36e.slice/crio-5eb013a2292deb8d31874897c7692354b03f77cc28a969bcca64fe17195f9ae9 WatchSource:0}: Error finding container 5eb013a2292deb8d31874897c7692354b03f77cc28a969bcca64fe17195f9ae9: Status 404 returned error can't find the container with id 5eb013a2292deb8d31874897c7692354b03f77cc28a969bcca64fe17195f9ae9 Apr 22 15:12:38.326092 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:38.326053 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" event={"ID":"a1934745-764b-47c3-9de5-c57c40ccd36e","Type":"ContainerStarted","Data":"5eb013a2292deb8d31874897c7692354b03f77cc28a969bcca64fe17195f9ae9"} Apr 22 15:12:40.334220 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:40.334134 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" event={"ID":"a1934745-764b-47c3-9de5-c57c40ccd36e","Type":"ContainerStarted","Data":"515cfd2c19bd3217ce816b5c0d9075a49dd625a67a6a27775365fb54051020fa"} Apr 22 15:12:40.334220 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:40.334213 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:12:40.350353 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:40.350293 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" podStartSLOduration=1.031929999 podStartE2EDuration="3.350276511s" podCreationTimestamp="2026-04-22 15:12:37 +0000 UTC" firstStartedPulling="2026-04-22 15:12:37.709507557 +0000 UTC m=+261.823456140" lastFinishedPulling="2026-04-22 15:12:40.027854056 +0000 UTC m=+264.141802652" observedRunningTime="2026-04-22 15:12:40.348848379 +0000 UTC m=+264.462796987" watchObservedRunningTime="2026-04-22 15:12:40.350276511 +0000 UTC m=+264.464225116" Apr 22 15:12:56.341968 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:12:56.341915 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-bnc67" Apr 22 15:13:16.294989 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:13:16.294958 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:13:16.305733 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:13:16.305704 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:13:16.307600 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:13:16.307576 2606 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 15:18:16.325316 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:18:16.325276 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:18:16.329682 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:18:16.329654 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:23:16.352089 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:23:16.352059 2606 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:23:16.352597 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:23:16.352061 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:28:16.375490 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:28:16.375382 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:28:16.377519 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:28:16.376038 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:33:16.396421 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:33:16.396298 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:33:16.398411 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:33:16.397632 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:38:16.418311 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:38:16.418220 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:38:16.421345 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:38:16.419749 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:43:16.438618 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:43:16.438493 2606 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:43:16.442676 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:43:16.440443 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:48:16.460635 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:48:16.460516 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:48:16.465110 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:48:16.462497 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:53:16.483553 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:53:16.483434 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:53:16.487936 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:53:16.486086 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:54:55.585229 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.585144 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4mnwn/must-gather-dx8c5"] Apr 22 15:54:55.588629 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.588607 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.591043 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.591024 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4mnwn\"/\"kube-root-ca.crt\"" Apr 22 15:54:55.591378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.591345 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4mnwn\"/\"openshift-service-ca.crt\"" Apr 22 15:54:55.592243 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.592226 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4mnwn\"/\"default-dockercfg-jpkxl\"" Apr 22 15:54:55.601901 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.601879 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4mnwn/must-gather-dx8c5"] Apr 22 15:54:55.617333 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.617309 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq4jz\" (UniqueName: \"kubernetes.io/projected/f6c1e231-a940-4f4f-88db-1cc7366ee08d-kube-api-access-jq4jz\") pod \"must-gather-dx8c5\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.617454 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.617418 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6c1e231-a940-4f4f-88db-1cc7366ee08d-must-gather-output\") pod \"must-gather-dx8c5\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.718627 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.718593 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jz\" (UniqueName: 
\"kubernetes.io/projected/f6c1e231-a940-4f4f-88db-1cc7366ee08d-kube-api-access-jq4jz\") pod \"must-gather-dx8c5\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.718785 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.718646 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6c1e231-a940-4f4f-88db-1cc7366ee08d-must-gather-output\") pod \"must-gather-dx8c5\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.718995 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.718977 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6c1e231-a940-4f4f-88db-1cc7366ee08d-must-gather-output\") pod \"must-gather-dx8c5\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.725984 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.725965 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq4jz\" (UniqueName: \"kubernetes.io/projected/f6c1e231-a940-4f4f-88db-1cc7366ee08d-kube-api-access-jq4jz\") pod \"must-gather-dx8c5\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:55.897617 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:55.897531 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:54:56.018064 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:56.018029 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4mnwn/must-gather-dx8c5"] Apr 22 15:54:56.021134 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:54:56.021107 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6c1e231_a940_4f4f_88db_1cc7366ee08d.slice/crio-85a21ab0b40823acc534e5054d4bf353488296c8b8bc2c7e9fa6c6b979053e10 WatchSource:0}: Error finding container 85a21ab0b40823acc534e5054d4bf353488296c8b8bc2c7e9fa6c6b979053e10: Status 404 returned error can't find the container with id 85a21ab0b40823acc534e5054d4bf353488296c8b8bc2c7e9fa6c6b979053e10 Apr 22 15:54:56.022913 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:56.022891 2606 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:54:56.700204 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:54:56.700168 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" event={"ID":"f6c1e231-a940-4f4f-88db-1cc7366ee08d","Type":"ContainerStarted","Data":"85a21ab0b40823acc534e5054d4bf353488296c8b8bc2c7e9fa6c6b979053e10"} Apr 22 15:55:01.717656 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:01.717617 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" event={"ID":"f6c1e231-a940-4f4f-88db-1cc7366ee08d","Type":"ContainerStarted","Data":"52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607"} Apr 22 15:55:01.717656 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:01.717659 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" 
event={"ID":"f6c1e231-a940-4f4f-88db-1cc7366ee08d","Type":"ContainerStarted","Data":"7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd"} Apr 22 15:55:01.733862 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:01.733811 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" podStartSLOduration=1.6540367790000001 podStartE2EDuration="6.733794753s" podCreationTimestamp="2026-04-22 15:54:55 +0000 UTC" firstStartedPulling="2026-04-22 15:54:56.023079875 +0000 UTC m=+2800.137028472" lastFinishedPulling="2026-04-22 15:55:01.102837864 +0000 UTC m=+2805.216786446" observedRunningTime="2026-04-22 15:55:01.73177084 +0000 UTC m=+2805.845719456" watchObservedRunningTime="2026-04-22 15:55:01.733794753 +0000 UTC m=+2805.847743357" Apr 22 15:55:10.912994 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:10.912961 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-bnc67_a1934745-764b-47c3-9de5-c57c40ccd36e/manager/0.log" Apr 22 15:55:11.344112 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:11.344078 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-bnc67_a1934745-764b-47c3-9de5-c57c40ccd36e/manager/0.log" Apr 22 15:55:11.785621 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:11.785585 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-bnc67_a1934745-764b-47c3-9de5-c57c40ccd36e/manager/0.log" Apr 22 15:55:46.866293 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:46.866253 2606 generic.go:358] "Generic (PLEG): container finished" podID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerID="7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd" exitCode=0 Apr 22 15:55:46.866762 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:46.866326 2606 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" event={"ID":"f6c1e231-a940-4f4f-88db-1cc7366ee08d","Type":"ContainerDied","Data":"7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd"} Apr 22 15:55:46.866762 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:46.866671 2606 scope.go:117] "RemoveContainer" containerID="7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd" Apr 22 15:55:46.956205 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:46.956154 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4mnwn_must-gather-dx8c5_f6c1e231-a940-4f4f-88db-1cc7366ee08d/gather/0.log" Apr 22 15:55:52.284295 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.284256 2606 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4mnwn/must-gather-dx8c5"] Apr 22 15:55:52.284882 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.284563 2606 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="copy" containerID="cri-o://52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607" gracePeriod=2 Apr 22 15:55:52.285908 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.285883 2606 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4mnwn/must-gather-dx8c5"] Apr 22 15:55:52.286561 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.286537 2606 status_manager.go:895] "Failed to get status for pod" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" err="pods \"must-gather-dx8c5\" is forbidden: User \"system:node:ip-10-0-141-188.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-4mnwn\": no relationship found between node 'ip-10-0-141-188.ec2.internal' and this object" Apr 22 15:55:52.514513 ip-10-0-141-188 kubenswrapper[2606]: I0422 
15:55:52.514488 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4mnwn_must-gather-dx8c5_f6c1e231-a940-4f4f-88db-1cc7366ee08d/copy/0.log" Apr 22 15:55:52.514850 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.514832 2606 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:55:52.703680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.703582 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq4jz\" (UniqueName: \"kubernetes.io/projected/f6c1e231-a940-4f4f-88db-1cc7366ee08d-kube-api-access-jq4jz\") pod \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " Apr 22 15:55:52.703680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.703671 2606 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6c1e231-a940-4f4f-88db-1cc7366ee08d-must-gather-output\") pod \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\" (UID: \"f6c1e231-a940-4f4f-88db-1cc7366ee08d\") " Apr 22 15:55:52.705684 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.705654 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c1e231-a940-4f4f-88db-1cc7366ee08d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f6c1e231-a940-4f4f-88db-1cc7366ee08d" (UID: "f6c1e231-a940-4f4f-88db-1cc7366ee08d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:55:52.705851 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.705829 2606 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c1e231-a940-4f4f-88db-1cc7366ee08d-kube-api-access-jq4jz" (OuterVolumeSpecName: "kube-api-access-jq4jz") pod "f6c1e231-a940-4f4f-88db-1cc7366ee08d" (UID: "f6c1e231-a940-4f4f-88db-1cc7366ee08d"). InnerVolumeSpecName "kube-api-access-jq4jz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:55:52.804677 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.804637 2606 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jq4jz\" (UniqueName: \"kubernetes.io/projected/f6c1e231-a940-4f4f-88db-1cc7366ee08d-kube-api-access-jq4jz\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:55:52.804677 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.804669 2606 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f6c1e231-a940-4f4f-88db-1cc7366ee08d-must-gather-output\") on node \"ip-10-0-141-188.ec2.internal\" DevicePath \"\"" Apr 22 15:55:52.885841 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.885809 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4mnwn_must-gather-dx8c5_f6c1e231-a940-4f4f-88db-1cc7366ee08d/copy/0.log" Apr 22 15:55:52.886137 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.886111 2606 generic.go:358] "Generic (PLEG): container finished" podID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerID="52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607" exitCode=143 Apr 22 15:55:52.886220 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.886166 2606 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4mnwn/must-gather-dx8c5" Apr 22 15:55:52.886268 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.886215 2606 scope.go:117] "RemoveContainer" containerID="52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607" Apr 22 15:55:52.894381 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.894347 2606 scope.go:117] "RemoveContainer" containerID="7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd" Apr 22 15:55:52.905878 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.905861 2606 scope.go:117] "RemoveContainer" containerID="52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607" Apr 22 15:55:52.906117 ip-10-0-141-188 kubenswrapper[2606]: E0422 15:55:52.906099 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607\": container with ID starting with 52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607 not found: ID does not exist" containerID="52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607" Apr 22 15:55:52.906154 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.906131 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607"} err="failed to get container status \"52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607\": rpc error: code = NotFound desc = could not find container \"52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607\": container with ID starting with 52fff8907be1e2f0e60d893f5e76c8f47bb59411141699141616f800cf890607 not found: ID does not exist" Apr 22 15:55:52.906194 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.906153 2606 scope.go:117] "RemoveContainer" containerID="7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd" Apr 22 15:55:52.906346 
ip-10-0-141-188 kubenswrapper[2606]: E0422 15:55:52.906332 2606 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd\": container with ID starting with 7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd not found: ID does not exist" containerID="7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd" Apr 22 15:55:52.906399 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:52.906351 2606 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd"} err="failed to get container status \"7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd\": rpc error: code = NotFound desc = could not find container \"7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd\": container with ID starting with 7b50049b46a824fb4e675f42aedd5b7783a4d959d20f342c5cd8cdfb2644aacd not found: ID does not exist" Apr 22 15:55:54.415562 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:55:54.415531 2606 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" path="/var/lib/kubelet/pods/f6c1e231-a940-4f4f-88db-1cc7366ee08d/volumes" Apr 22 15:56:06.790686 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:06.790648 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qrz9r_7d0f53d5-b964-4a2e-a21f-f5f4d2a4e281/global-pull-secret-syncer/0.log" Apr 22 15:56:06.859313 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:06.859281 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s2b7k_6e2a4fe1-ffb7-4e8e-8401-5dae97434c83/konnectivity-agent/0.log" Apr 22 15:56:06.931029 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:06.930997 2606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-188.ec2.internal_af2477b9db2ffbcd2bd186aed8c8adcf/haproxy/0.log" Apr 22 15:56:10.289224 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.289197 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/alertmanager/0.log" Apr 22 15:56:10.321215 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.321190 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/config-reloader/0.log" Apr 22 15:56:10.343208 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.343185 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/kube-rbac-proxy-web/0.log" Apr 22 15:56:10.374008 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.373986 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/kube-rbac-proxy/0.log" Apr 22 15:56:10.399325 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.399301 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/kube-rbac-proxy-metric/0.log" Apr 22 15:56:10.426093 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.426069 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/prom-label-proxy/0.log" Apr 22 15:56:10.449378 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.449347 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_16fb3581-9c8f-4729-ade0-66a65a61cb4b/init-config-reloader/0.log" Apr 22 15:56:10.513730 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.513696 2606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f64bz_f131fc27-d3d8-4975-912a-262223f2a995/kube-state-metrics/0.log" Apr 22 15:56:10.534659 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.534633 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f64bz_f131fc27-d3d8-4975-912a-262223f2a995/kube-rbac-proxy-main/0.log" Apr 22 15:56:10.554854 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.554795 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f64bz_f131fc27-d3d8-4975-912a-262223f2a995/kube-rbac-proxy-self/0.log" Apr 22 15:56:10.640147 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.640122 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n4dfk_d3bf30d0-0409-4605-ad12-51f7fa8f533a/node-exporter/0.log" Apr 22 15:56:10.664617 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.664594 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n4dfk_d3bf30d0-0409-4605-ad12-51f7fa8f533a/kube-rbac-proxy/0.log" Apr 22 15:56:10.683736 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.683714 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n4dfk_d3bf30d0-0409-4605-ad12-51f7fa8f533a/init-textfile/0.log" Apr 22 15:56:10.870720 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.870696 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8whlc_12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d/kube-rbac-proxy-main/0.log" Apr 22 15:56:10.890690 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.890668 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8whlc_12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d/kube-rbac-proxy-self/0.log" Apr 22 15:56:10.911579 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:56:10.911550 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8whlc_12330f4f-a28c-4e98-afe7-f2ef0bd3cd1d/openshift-state-metrics/0.log" Apr 22 15:56:10.943146 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.943118 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/prometheus/0.log" Apr 22 15:56:10.962101 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.962074 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/config-reloader/0.log" Apr 22 15:56:10.981632 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:10.981592 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/thanos-sidecar/0.log" Apr 22 15:56:11.003167 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:11.003146 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/kube-rbac-proxy-web/0.log" Apr 22 15:56:11.022971 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:11.022931 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/kube-rbac-proxy/0.log" Apr 22 15:56:11.042102 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:11.042083 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/kube-rbac-proxy-thanos/0.log" Apr 22 15:56:11.061478 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:11.061439 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_95b61066-9752-47f7-8f27-f6ea6c8aa282/init-config-reloader/0.log" Apr 22 15:56:11.163093 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:56:11.163069 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bc59b9f4d-7ch2f_af64188f-395c-4e52-a211-cf0173131bd3/telemeter-client/0.log" Apr 22 15:56:11.184125 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:11.184099 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bc59b9f4d-7ch2f_af64188f-395c-4e52-a211-cf0173131bd3/reload/0.log" Apr 22 15:56:11.205745 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:11.205716 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bc59b9f4d-7ch2f_af64188f-395c-4e52-a211-cf0173131bd3/kube-rbac-proxy/0.log" Apr 22 15:56:14.380441 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:14.380412 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lb9c7_e0a9849c-6b92-4aa1-b14f-9246ef0c29f3/dns/0.log" Apr 22 15:56:14.398814 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:14.398788 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lb9c7_e0a9849c-6b92-4aa1-b14f-9246ef0c29f3/kube-rbac-proxy/0.log" Apr 22 15:56:14.442707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:14.442679 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9wsxv_002b7b51-3bad-42f0-b2ba-e4180da5923c/dns-node-resolver/0.log" Apr 22 15:56:14.861081 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:14.861047 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2hlwz_d3bf255c-11af-482e-b25f-be50be214aed/node-ca/0.log" Apr 22 15:56:15.095127 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095086 2606 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6"] Apr 22 15:56:15.095518 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095498 2606 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="gather" Apr 22 15:56:15.095518 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095518 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="gather" Apr 22 15:56:15.095707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095543 2606 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="copy" Apr 22 15:56:15.095707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095551 2606 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="copy" Apr 22 15:56:15.095707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095640 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="copy" Apr 22 15:56:15.095707 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.095655 2606 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6c1e231-a940-4f4f-88db-1cc7366ee08d" containerName="gather" Apr 22 15:56:15.098671 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.098647 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.101031 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.100961 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xjqzk\"/\"kube-root-ca.crt\"" Apr 22 15:56:15.101153 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.101116 2606 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xjqzk\"/\"openshift-service-ca.crt\"" Apr 22 15:56:15.101962 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.101947 2606 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xjqzk\"/\"default-dockercfg-ct867\"" Apr 22 15:56:15.108952 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.108932 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6"] Apr 22 15:56:15.185644 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.185552 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-podres\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.185644 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.185633 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-proc\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.185840 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.185655 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-lib-modules\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.185840 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.185690 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-sys\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.185840 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.185764 2606 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qw2\" (UniqueName: \"kubernetes.io/projected/d00d16e3-db45-4398-a7d9-a90e6bad6638-kube-api-access-46qw2\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.286826 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.286792 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-sys\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.286835 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46qw2\" (UniqueName: \"kubernetes.io/projected/d00d16e3-db45-4398-a7d9-a90e6bad6638-kube-api-access-46qw2\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " 
pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.286858 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-podres\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.286926 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-sys\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287036 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.287000 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-podres\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287235 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.287041 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-proc\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287235 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.287064 2606 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-lib-modules\") pod 
\"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287235 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.287129 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-proc\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.287235 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.287199 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d00d16e3-db45-4398-a7d9-a90e6bad6638-lib-modules\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.294287 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.294269 2606 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qw2\" (UniqueName: \"kubernetes.io/projected/d00d16e3-db45-4398-a7d9-a90e6bad6638-kube-api-access-46qw2\") pod \"perf-node-gather-daemonset-dtnh6\" (UID: \"d00d16e3-db45-4398-a7d9-a90e6bad6638\") " pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.409060 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.409026 2606 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.533847 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.531967 2606 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6"] Apr 22 15:56:15.537455 ip-10-0-141-188 kubenswrapper[2606]: W0422 15:56:15.537412 2606 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd00d16e3_db45_4398_a7d9_a90e6bad6638.slice/crio-d50e1bf56550d6b4de5006ef556a24a4fd02f30da96c6501c14af0e8dcc31a85 WatchSource:0}: Error finding container d50e1bf56550d6b4de5006ef556a24a4fd02f30da96c6501c14af0e8dcc31a85: Status 404 returned error can't find the container with id d50e1bf56550d6b4de5006ef556a24a4fd02f30da96c6501c14af0e8dcc31a85 Apr 22 15:56:15.945853 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.945796 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ndg26_b4b5d2c6-67fb-4e8b-b072-9a3d47f86162/serve-healthcheck-canary/0.log" Apr 22 15:56:15.955680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.955648 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" event={"ID":"d00d16e3-db45-4398-a7d9-a90e6bad6638","Type":"ContainerStarted","Data":"017008d4665968105bf05abb6391d860090b36a2d32311632c37455f83faeaee"} Apr 22 15:56:15.955680 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.955681 2606 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" event={"ID":"d00d16e3-db45-4398-a7d9-a90e6bad6638","Type":"ContainerStarted","Data":"d50e1bf56550d6b4de5006ef556a24a4fd02f30da96c6501c14af0e8dcc31a85"} Apr 22 15:56:15.955915 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.955793 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:15.971798 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:15.971753 2606 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" podStartSLOduration=0.971741369 podStartE2EDuration="971.741369ms" podCreationTimestamp="2026-04-22 15:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:56:15.970598376 +0000 UTC m=+2880.084546981" watchObservedRunningTime="2026-04-22 15:56:15.971741369 +0000 UTC m=+2880.085689972" Apr 22 15:56:16.453253 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:16.453222 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xgdfl_8b72d438-c2b7-4709-a0a5-3c11f2a7894e/kube-rbac-proxy/0.log" Apr 22 15:56:16.471463 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:16.471431 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xgdfl_8b72d438-c2b7-4709-a0a5-3c11f2a7894e/exporter/0.log" Apr 22 15:56:16.491295 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:16.491268 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xgdfl_8b72d438-c2b7-4709-a0a5-3c11f2a7894e/extractor/0.log" Apr 22 15:56:21.967884 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:21.967859 2606 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xjqzk/perf-node-gather-daemonset-dtnh6" Apr 22 15:56:22.594691 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.594654 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/kube-multus-additional-cni-plugins/0.log" Apr 22 15:56:22.617375 ip-10-0-141-188 kubenswrapper[2606]: 
I0422 15:56:22.617343 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/egress-router-binary-copy/0.log" Apr 22 15:56:22.637970 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.637946 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/cni-plugins/0.log" Apr 22 15:56:22.656822 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.656800 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/bond-cni-plugin/0.log" Apr 22 15:56:22.675220 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.675201 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/routeoverride-cni/0.log" Apr 22 15:56:22.693738 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.693712 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/whereabouts-cni-bincopy/0.log" Apr 22 15:56:22.712670 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.712646 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcx5t_807f61ad-b285-4b1d-b001-650d8ea8f622/whereabouts-cni/0.log" Apr 22 15:56:22.785441 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.785412 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zlnwc_4cf417d6-b90e-4570-8262-b67044850c51/kube-multus/0.log" Apr 22 15:56:22.804079 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.804054 2606 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-75v74_7069128e-a7fb-43e9-a858-e8e3250b2ac0/network-metrics-daemon/0.log" Apr 22 15:56:22.821340 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:22.821317 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-75v74_7069128e-a7fb-43e9-a858-e8e3250b2ac0/kube-rbac-proxy/0.log" Apr 22 15:56:23.731704 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.731679 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-controller/0.log" Apr 22 15:56:23.754711 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.754686 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/0.log" Apr 22 15:56:23.769308 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.769275 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovn-acl-logging/1.log" Apr 22 15:56:23.788836 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.788811 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/kube-rbac-proxy-node/0.log" Apr 22 15:56:23.812542 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.812523 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 15:56:23.832373 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.832347 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/northd/0.log" Apr 22 15:56:23.852213 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.852184 2606 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/nbdb/0.log" Apr 22 15:56:23.874709 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.874679 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/sbdb/0.log" Apr 22 15:56:23.978705 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:23.978681 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ck8h2_d9e225c1-9713-4720-8357-aaf7078a9c2d/ovnkube-controller/0.log" Apr 22 15:56:25.700234 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:25.700205 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-w6z28_9331fcba-cdee-486e-b00b-7bb28c810ab9/network-check-target-container/0.log" Apr 22 15:56:26.594428 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:26.594336 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f47hr_4158dca8-c8a1-478c-92e2-6eff8a81fd54/iptables-alerter/0.log" Apr 22 15:56:27.236497 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:27.236475 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-w9rvd_723a6368-c49b-450c-9575-bc16b7c8b86f/tuned/0.log" Apr 22 15:56:30.436765 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:30.436733 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-6zbs6_cdcf5e3d-d86e-4839-ab5e-b1b382d5c832/csi-driver/0.log" Apr 22 15:56:30.456340 ip-10-0-141-188 kubenswrapper[2606]: I0422 15:56:30.456312 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-6zbs6_cdcf5e3d-d86e-4839-ab5e-b1b382d5c832/csi-node-driver-registrar/0.log" Apr 22 15:56:30.477155 ip-10-0-141-188 
kubenswrapper[2606]: I0422 15:56:30.477125 2606 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-6zbs6_cdcf5e3d-d86e-4839-ab5e-b1b382d5c832/csi-liveness-probe/0.log"