Apr 21 03:57:56.699542 ip-10-0-138-120 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 03:57:57.205032 ip-10-0-138-120 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:57.205032 ip-10-0-138-120 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 03:57:57.205032 ip-10-0-138-120 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:57.205032 ip-10-0-138-120 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 03:57:57.205032 ip-10-0-138-120 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 03:57:57.208116 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.208016 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 03:57:57.215399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215375 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:57.215399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215394 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:57.215399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215399 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:57.215399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215403 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:57.215399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215406 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215409 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215413 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215416 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215419 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215422 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215425 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215427 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215430 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215433 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215436 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215438 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215441 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215444 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215447 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215449 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215452 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215454 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215457 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215459 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:57.215605 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215462 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215466 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215470 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215472 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215475 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215478 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215481 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215491 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215494 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215498 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215501 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215503 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215506 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215508 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215512 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215515 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215518 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215520 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215523 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:57.216096 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215525 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215528 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215530 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215533 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215536 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215538 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215541 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215543 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215546 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215548 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215551 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215554 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215556 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215558 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215561 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215563 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215566 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215568 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215571 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215573 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:57.216653 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215576 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215579 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215581 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215583 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215586 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215589 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215591 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215595 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215597 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215601 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215603 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215606 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215610 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215614 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215617 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215620 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215623 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215626 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215629 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215631 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:57.217140 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215634 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215636 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.215639 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216033 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216038 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216042 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216045 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216048 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216050 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216061 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216065 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216068 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216071 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216074 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216077 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216079 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216082 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216084 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216087 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216090 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:57.217644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216093 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216096 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216099 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216101 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216104 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216107 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216109 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216113 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216115 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216118 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216120 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216123 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216125 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216128 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216130 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216132 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216135 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216137 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216140 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216142 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:57.218114 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216144 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216147 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216151 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216154 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216157 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216160 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216163 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216165 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216168 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216170 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216173 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216177 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216180 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216190 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216193 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216196 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216199 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216201 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216204 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:57.218630 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216207 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216210 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216212 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216215 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216218 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216220 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216223 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216225 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216228 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216230 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216233 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216253 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216256 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216258 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216261 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216263 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216266 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216268 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216271 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216274 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:57.219137 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216276 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216280 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216283 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216285 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216289 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216292 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216294 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216297 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216299 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.216302 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218254 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218264 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218270 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218275 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218281 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218285 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218289 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218293 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218297 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218300 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218303 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218307 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 03:57:57.219644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218311 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218313 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218316 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218319 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218322 2569 flags.go:64] FLAG: --cloud-config=""
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218325 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218328 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218333 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218336 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218339 2569 flags.go:64] FLAG: --config-dir=""
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218342 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218345 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218349 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218352 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218356 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218359 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218363 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218366 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218368 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218372 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218374 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218379 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218382 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218385 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218387 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 03:57:57.220187 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218392 2569 flags.go:64] FLAG: --enable-server="true"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218395 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218399 2569 flags.go:64] FLAG: --event-burst="100"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218402 2569 flags.go:64] FLAG: --event-qps="50"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218406 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218409 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218412 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218416 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218418 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218421 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218424 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218427 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218431 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218434 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218437 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218440 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218443 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218446 2569 flags.go:64] FLAG: --feature-gates=""
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218449 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218452 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218456 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218459 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218462 2569 flags.go:64] FLAG:
--healthz-port="10248" Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218465 2569 flags.go:64] FLAG: --help="false" Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218468 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.220809 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218471 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218474 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218477 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218481 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218484 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218487 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218490 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218492 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218496 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218499 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218503 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 03:57:57.221417 ip-10-0-138-120 
kubenswrapper[2569]: I0421 03:57:57.218506 2569 flags.go:64] FLAG: --kube-reserved="" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218509 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218512 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218515 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218518 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218521 2569 flags.go:64] FLAG: --lock-file="" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218523 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218526 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218529 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218535 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218537 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218540 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 03:57:57.221417 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218543 2569 flags.go:64] FLAG: --logging-format="text" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218546 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218549 2569 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218552 2569 flags.go:64] FLAG: --manifest-url="" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218555 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218560 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218563 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218568 2569 flags.go:64] FLAG: --max-pods="110" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218571 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218574 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218577 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218580 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218582 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218585 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218588 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218596 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218599 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 
03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218602 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218605 2569 flags.go:64] FLAG: --pod-cidr="" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218608 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218614 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218617 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218620 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218623 2569 flags.go:64] FLAG: --port="10250" Apr 21 03:57:57.221962 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218626 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218629 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fa2494bf0248176a" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218632 2569 flags.go:64] FLAG: --qos-reserved="" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218635 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218639 2569 flags.go:64] FLAG: --register-node="true" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218641 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218644 2569 flags.go:64] FLAG: --register-with-taints="" Apr 21 03:57:57.222630 ip-10-0-138-120 
kubenswrapper[2569]: I0421 03:57:57.218648 2569 flags.go:64] FLAG: --registry-burst="10" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218651 2569 flags.go:64] FLAG: --registry-qps="5" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218654 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218656 2569 flags.go:64] FLAG: --reserved-memory="" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218660 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218663 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218667 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218670 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218673 2569 flags.go:64] FLAG: --runonce="false" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218676 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218679 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218682 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218685 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218688 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218691 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 
03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218693 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218696 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218699 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218702 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 03:57:57.222630 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218705 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218708 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218712 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218714 2569 flags.go:64] FLAG: --system-cgroups="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218717 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218723 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218725 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218728 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218733 2569 flags.go:64] FLAG: --tls-min-version="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218736 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218739 2569 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218742 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218744 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218747 2569 flags.go:64] FLAG: --v="2" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218752 2569 flags.go:64] FLAG: --version="false" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218756 2569 flags.go:64] FLAG: --vmodule="" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218760 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.218763 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218863 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218867 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218870 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218873 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218876 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218879 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 03:57:57.223267 ip-10-0-138-120 kubenswrapper[2569]: W0421 
03:57:57.218881 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218883 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218886 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218888 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218891 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218893 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218896 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218898 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218901 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218903 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218906 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218909 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218911 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 03:57:57.223841 ip-10-0-138-120 
kubenswrapper[2569]: W0421 03:57:57.218914 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218916 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218918 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218921 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218923 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218926 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218929 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 21 03:57:57.223841 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218931 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218934 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218936 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218939 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218943 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218946 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218949 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218952 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218954 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218957 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218959 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218962 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218964 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218966 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218969 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218971 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218974 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218976 2569 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218978 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 03:57:57.224403 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218981 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218983 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218986 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218988 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218991 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218993 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218996 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.218998 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219000 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219003 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219005 2569 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219010 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219013 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219015 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219018 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219020 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219022 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219025 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219027 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219030 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 03:57:57.224879 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219032 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219035 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219037 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219040 2569 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219042 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219045 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219047 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219049 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219052 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219054 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219058 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219061 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219064 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219066 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219069 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219071 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219074 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219077 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219079 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:57.225409 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219082 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:57.225866 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.219084 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:57.225866 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.219842 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:57.226647 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.226628 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 03:57:57.226684 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.226648 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226697 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226703 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226706 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226709 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226712 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226715 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226718 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:57.226717 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226721 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226724 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226727 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226729 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226732 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226734 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226737 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226739 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226742 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226744 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226747 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226749 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226751 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226754 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226757 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226759 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226762 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226764 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226771 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:57.226917 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226775 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226778 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226781 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226783 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226786 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226789 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226791 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226794 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226796 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226799 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226801 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226804 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226807 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226809 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226812 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226815 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226818 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226820 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226823 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226827 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:57.227398 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226830 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226833 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226835 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226838 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226840 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226843 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226846 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226848 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226851 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226853 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226856 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226858 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226861 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226864 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226866 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226869 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226871 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226874 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226876 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226879 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:57.227892 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226881 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226884 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226886 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226889 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226891 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226894 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226897 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226900 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226902 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226904 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226908 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226911 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226913 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226916 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226919 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226921 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226924 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226927 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226930 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:57.228399 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.226932 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.226937 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227041 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227047 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227050 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227054 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227064 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227067 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227070 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227073 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227075 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227077 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227080 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227082 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227085 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 03:57:57.228897 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227087 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227089 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227092 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227094 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227097 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227100 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227103 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227105 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227108 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227110 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227113 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227115 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227118 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227121 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227125 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227127 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227130 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227133 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227135 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227138 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 03:57:57.229284 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227140 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227143 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227145 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227148 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227151 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227153 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227156 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227158 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227161 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227163 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227165 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227168 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227170 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227172 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227175 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227177 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227179 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227182 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227185 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227188 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 03:57:57.229770 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227190 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227193 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227195 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227198 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227200 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227203 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227205 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227208 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227210 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227212 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227215 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227217 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227219 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227222 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227224 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227226 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227229 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227254 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227258 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 03:57:57.230283 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227260 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227263 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227265 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227268 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227271 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227273 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227275 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227278 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227280 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227283 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227286 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227288 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227291 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:57.227293 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.227298 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 03:57:57.230741 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.228134 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 03:57:57.231104 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.230323 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 03:57:57.231632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.231620 2569 server.go:1019] "Starting client certificate rotation"
Apr 21 03:57:57.231731 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.231715 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:57:57.231770 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.231761 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 03:57:57.259890 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.259871 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:57:57.262802 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.262785 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 03:57:57.277552 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.277530 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 21 03:57:57.283947 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.283931 2569 log.go:25] "Validated CRI v1 image API"
Apr 21 03:57:57.285290 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.285264 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 03:57:57.285568 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.285553 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 03:57:57.290656 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.290636 2569 fs.go:135] Filesystem UUIDs: map[553961b4-3fab-4738-b4f0-dfc980b84267:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d7391bd9-0ad7-4ee8-85c1-bed387408475:/dev/nvme0n1p3]
Apr 21 03:57:57.290709 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.290657 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 03:57:57.295636 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.295518 2569 manager.go:217] Machine: {Timestamp:2026-04-21 03:57:57.294350442 +0000 UTC m=+0.461746870 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097614 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec213384c9d06970da008af6153a7253 SystemUUID:ec213384-c9d0-6970-da00-8af6153a7253 BootID:110c06be-14ff-42c6-9c9e-a705cff430de Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b5:7c:a6:89:b1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b5:7c:a6:89:b1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:91:2a:fa:73:e6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 03:57:57.295636 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.295631 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 03:57:57.295749 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.295710 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 03:57:57.297101 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.297071 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 03:57:57.297314 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.297102 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-120.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 03:57:57.297393 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.297328 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 03:57:57.297393 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.297340 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 03:57:57.297393 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.297358 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:57.297393 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.297376 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 03:57:57.298339 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.298326 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:57.298460 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.298449 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 03:57:57.301360 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.301349 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 21 03:57:57.301426 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.301395 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 03:57:57.302391 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.302381 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 03:57:57.302441 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.302398 2569 kubelet.go:397] "Adding apiserver pod source" Apr 21 03:57:57.302441 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.302411 2569 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 21 03:57:57.303718 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.303705 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:57.303833 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.303728 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 03:57:57.307168 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.307151 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 03:57:57.308609 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.308596 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 03:57:57.310336 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310319 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dzbsg" Apr 21 03:57:57.310593 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310581 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310601 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310608 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310613 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310621 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:57:57.310630 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310639 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310646 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310656 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310662 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 03:57:57.310673 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310671 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 03:57:57.310922 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.310680 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 03:57:57.311749 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.311730 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-120.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 03:57:57.311851 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.311837 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 03:57:57.311896 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.311843 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" 
Apr 21 03:57:57.311896 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.311855 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 03:57:57.314044 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.314031 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-120.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 03:57:57.315616 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.315604 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 03:57:57.315659 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.315641 2569 server.go:1295] "Started kubelet" Apr 21 03:57:57.315744 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.315723 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 03:57:57.315918 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.315870 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 03:57:57.315982 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.315939 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 03:57:57.316061 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.316036 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dzbsg" Apr 21 03:57:57.316589 ip-10-0-138-120 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 03:57:57.317212 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.317195 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 03:57:57.321531 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.321510 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 21 03:57:57.324124 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.322110 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-120.ec2.internal.18a84324903f5c50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-120.ec2.internal,UID:ip-10-0-138-120.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-120.ec2.internal,},FirstTimestamp:2026-04-21 03:57:57.315615824 +0000 UTC m=+0.483012253,LastTimestamp:2026-04-21 03:57:57.315615824 +0000 UTC m=+0.483012253,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-120.ec2.internal,}" Apr 21 03:57:57.325372 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.325354 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:57.326698 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.325889 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 03:57:57.327106 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327082 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 03:57:57.327189 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327108 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 03:57:57.327269 
ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327229 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 03:57:57.327388 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327367 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 21 03:57:57.327388 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327382 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 21 03:57:57.327603 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.327586 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:57.327925 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.327902 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 03:57:57.328004 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327933 2569 factory.go:55] Registering systemd factory Apr 21 03:57:57.328004 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.327955 2569 factory.go:223] Registration of the systemd container factory successfully Apr 21 03:57:57.328189 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.328177 2569 factory.go:153] Registering CRI-O factory Apr 21 03:57:57.328189 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.328191 2569 factory.go:223] Registration of the crio container factory successfully Apr 21 03:57:57.328333 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.328231 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 03:57:57.328333 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.328277 2569 factory.go:103] Registering Raw factory Apr 21 03:57:57.328333 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:57:57.328291 2569 manager.go:1196] Started watching for new ooms in manager Apr 21 03:57:57.328918 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.328903 2569 manager.go:319] Starting recovery of all containers Apr 21 03:57:57.338192 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.338014 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:57.339755 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.339739 2569 manager.go:324] Recovery completed Apr 21 03:57:57.340862 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.340846 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-120.ec2.internal\" not found" node="ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.344510 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.344497 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:57.347030 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.347015 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:57.347101 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.347043 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:57.347101 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.347054 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:57.347595 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.347582 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 03:57:57.347595 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.347593 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 03:57:57.347698 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:57:57.347611 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 21 03:57:57.350037 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.350025 2569 policy_none.go:49] "None policy: Start" Apr 21 03:57:57.350079 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.350041 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 03:57:57.350079 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.350051 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 21 03:57:57.386834 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.386817 2569 manager.go:341] "Starting Device Plugin manager" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.386852 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.386863 2569 server.go:85] "Starting device plugin registration server" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.387088 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.387101 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.387290 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.387363 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.387374 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.387966 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate 
container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 03:57:57.401528 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.388007 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:57.463946 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.463896 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 03:57:57.465288 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.465267 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 03:57:57.465369 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.465292 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 03:57:57.465369 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.465310 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 03:57:57.465369 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.465316 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 03:57:57.465369 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.465348 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 03:57:57.467515 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.467497 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:57.488547 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.488511 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:57.489312 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.489297 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:57.489398 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.489331 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:57.489398 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.489354 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:57.489398 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.489386 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.498030 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.498016 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.498108 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.498037 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-120.ec2.internal\": node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 
03:57:57.521708 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.521681 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:57.565956 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.565913 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal"] Apr 21 03:57:57.566060 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.566018 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:57.566919 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.566903 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:57.567008 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.566929 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:57.567008 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.566939 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:57.568522 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.568508 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:57.568712 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.568696 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.568779 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.568731 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:57.569151 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.569134 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:57.569151 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.569144 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:57.569269 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.569170 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:57.569269 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.569186 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:57.569269 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.569170 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:57.569368 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.569275 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:57.570647 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.570625 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.570745 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.570654 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 03:57:57.571312 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.571296 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientMemory" Apr 21 03:57:57.571384 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.571324 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 03:57:57.571384 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.571334 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeHasSufficientPID" Apr 21 03:57:57.597939 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.597916 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-120.ec2.internal\" not found" node="ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.602095 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.602081 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-120.ec2.internal\" not found" node="ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.622288 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.622268 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:57.629274 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.629258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.629340 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.629287 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.629340 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.629305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c5556f6af36052a906fa0aef20bfb6c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-120.ec2.internal\" (UID: \"4c5556f6af36052a906fa0aef20bfb6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.723154 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.723081 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:57.729470 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.729446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.729534 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.729481 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/4c5556f6af36052a906fa0aef20bfb6c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-120.ec2.internal\" (UID: \"4c5556f6af36052a906fa0aef20bfb6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.729534 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.729497 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.729625 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.729533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.729625 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.729561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c5556f6af36052a906fa0aef20bfb6c-config\") pod \"kube-apiserver-proxy-ip-10-0-138-120.ec2.internal\" (UID: \"4c5556f6af36052a906fa0aef20bfb6c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.729625 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.729564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba5550cc00f884a90499316ee8207508-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal\" (UID: \"ba5550cc00f884a90499316ee8207508\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.823886 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.823845 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:57.900369 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.900335 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.905165 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:57.905145 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 21 03:57:57.924112 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:57.924087 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.024873 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.024783 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.125303 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.125273 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.145822 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.145800 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:58.225704 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.225671 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.231964 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.231947 2569 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 21 03:57:58.232098 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.232080 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:58.232145 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.232113 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:58.232181 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.232116 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 03:57:58.321106 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.321035 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 03:52:57 +0000 UTC" deadline="2028-01-14 00:35:55.34148696 +0000 UTC" Apr 21 03:57:58.321106 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.321067 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15188h37m57.020423244s" Apr 21 03:57:58.326211 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.326184 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.326342 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.326220 2569 certificate_manager.go:566] 
"Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 03:57:58.336172 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.336144 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 03:57:58.352663 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.352636 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xvdbc" Apr 21 03:57:58.360321 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.360302 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xvdbc" Apr 21 03:57:58.368642 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:58.368615 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c5556f6af36052a906fa0aef20bfb6c.slice/crio-3d8dba41b00afde67d8d77c01ce9d2c49e6db33029ff7a6cd20c24d86cc2ca50 WatchSource:0}: Error finding container 3d8dba41b00afde67d8d77c01ce9d2c49e6db33029ff7a6cd20c24d86cc2ca50: Status 404 returned error can't find the container with id 3d8dba41b00afde67d8d77c01ce9d2c49e6db33029ff7a6cd20c24d86cc2ca50 Apr 21 03:57:58.369074 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:57:58.369058 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba5550cc00f884a90499316ee8207508.slice/crio-faaf394d6545f98d3806d5845f0a8ec50ccac375eb6022dfa421a9ac84a046f8 WatchSource:0}: Error finding container faaf394d6545f98d3806d5845f0a8ec50ccac375eb6022dfa421a9ac84a046f8: Status 404 returned error can't find the container with id faaf394d6545f98d3806d5845f0a8ec50ccac375eb6022dfa421a9ac84a046f8 Apr 21 03:57:58.373073 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.373054 2569 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 21 03:57:58.426429 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.426389 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.468568 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.468521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" event={"ID":"ba5550cc00f884a90499316ee8207508","Type":"ContainerStarted","Data":"faaf394d6545f98d3806d5845f0a8ec50ccac375eb6022dfa421a9ac84a046f8"} Apr 21 03:57:58.469350 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.469331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" event={"ID":"4c5556f6af36052a906fa0aef20bfb6c","Type":"ContainerStarted","Data":"3d8dba41b00afde67d8d77c01ce9d2c49e6db33029ff7a6cd20c24d86cc2ca50"} Apr 21 03:57:58.526540 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.526510 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.626920 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.626891 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.727353 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:58.727327 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-120.ec2.internal\" not found" Apr 21 03:57:58.727602 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.727586 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:58.827860 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.827827 2569 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" Apr 21 03:57:58.839645 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.839622 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:58.840828 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.840792 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" Apr 21 03:57:58.848834 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:58.848812 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 03:57:59.168819 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.168787 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:59.246438 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.246412 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:57:59.303706 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.303681 2569 apiserver.go:52] "Watching apiserver" Apr 21 03:57:59.310507 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.310482 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 03:57:59.312093 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.312066 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg","openshift-multus/network-metrics-daemon-cxzzc","openshift-network-operator/iptables-alerter-zfsw2","openshift-ovn-kubernetes/ovnkube-node-bknrd","kube-system/konnectivity-agent-fs2vm","openshift-cluster-node-tuning-operator/tuned-k9mlc","openshift-dns/node-resolver-r5zq9","openshift-image-registry/node-ca-bw45p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal","openshift-multus/multus-additional-cni-plugins-wfmzk","openshift-multus/multus-lhbpw","openshift-network-diagnostics/network-check-target-vsvml","kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal"] Apr 21 03:57:59.314838 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.314815 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.315998 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.315974 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:57:59.316182 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.316081 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:57:59.316182 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.316097 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.317230 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.317178 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.317352 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.317178 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.317410 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.317370 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mplvf\"" Apr 21 03:57:59.318377 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.318357 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 03:57:59.318377 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.318366 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.318969 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.318950 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.319111 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.319098 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6cd8b\"" Apr 21 03:57:59.319893 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.319869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:57:59.322434 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.322410 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.322574 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.322562 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.323624 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.323339 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 03:57:59.323624 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.323447 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 03:57:59.323624 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.323508 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p2fp7\"" Apr 21 03:57:59.324636 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.324617 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.324981 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.324965 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 03:57:59.325674 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.325649 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lh26x\"" Apr 21 03:57:59.325767 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.325698 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.325767 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.325711 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.325870 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.325775 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 03:57:59.326029 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.326001 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.326140 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.326083 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 03:57:59.326140 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.326089 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4f8rx\"" Apr 21 03:57:59.326270 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.326092 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 03:57:59.326270 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.326012 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.327497 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.327479 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bw45p" Apr 21 03:57:59.327724 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.327708 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.328135 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.328116 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 03:57:59.328224 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.328145 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.328499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.328483 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zkdlh\"" Apr 21 03:57:59.328693 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.328676 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.329127 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.329107 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.329720 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.329699 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bsm59\"" Apr 21 03:57:59.329800 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.329742 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 03:57:59.329800 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.329785 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 03:57:59.329952 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.329936 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.330106 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330030 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 03:57:59.330175 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330146 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.330265 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330229 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 03:57:59.330508 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330491 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 03:57:59.330600 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330565 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6l8s2\"" Apr 21 03:57:59.330655 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330631 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 03:57:59.330929 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.330914 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:57:59.331010 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.330987 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:57:59.332282 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.332110 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zlc75\"" Apr 21 03:57:59.332282 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.332171 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 03:57:59.336782 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336761 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-kubelet\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.336887 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336794 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-etc-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.336887 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5c3da277-f4fb-47af-9b13-a2de86f37142-konnectivity-ca\") pod \"konnectivity-agent-fs2vm\" (UID: \"5c3da277-f4fb-47af-9b13-a2de86f37142\") " pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:57:59.336887 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336840 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.336887 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336858 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/276dfbb7-a93a-4da8-8b3b-f919c9642bca-tmp-dir\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.336887 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tmr\" (UniqueName: \"kubernetes.io/projected/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-kube-api-access-k8tmr\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-registration-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336969 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhh5\" (UniqueName: \"kubernetes.io/projected/276dfbb7-a93a-4da8-8b3b-f919c9642bca-kube-api-access-chhh5\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.336994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-slash\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-ovn\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337048 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovnkube-config\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysconfig\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.337143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337108 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-etc-selinux\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337161 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5c3da277-f4fb-47af-9b13-a2de86f37142-agent-certs\") pod \"konnectivity-agent-fs2vm\" (UID: \"5c3da277-f4fb-47af-9b13-a2de86f37142\") " pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337185 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysctl-d\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337207 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-cnibin\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-cni-netd\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337280 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-socket-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337298 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhl5j\" (UniqueName: \"kubernetes.io/projected/24c30e95-1770-4975-87be-9b1494d8904c-kube-api-access-mhl5j\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/229b10ec-c401-4070-945d-fa92e56f6443-host-slash\") pod \"iptables-alerter-zfsw2\" (UID: 
\"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337337 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-var-lib-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovnkube-script-lib\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysctl-conf\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337394 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-run\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-node-log\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337439 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-env-overrides\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovn-node-metrics-cert\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.337512 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-tuned\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-device-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337579 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9lf\" (UniqueName: \"kubernetes.io/projected/7fbebf58-6bc4-4d05-8b31-3098996af4db-kube-api-access-vq9lf\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-os-release\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337624 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-systemd\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-log-socket\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337680 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhphc\" (UniqueName: \"kubernetes.io/projected/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-kube-api-access-lhphc\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-host\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-serviceca\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337814 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-run-netns\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-kubernetes\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337849 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337902 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/229b10ec-c401-4070-945d-fa92e56f6443-iptables-alerter-script\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337946 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-tmp\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-sys-fs\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.338200 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.337999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qj6z\" (UniqueName: \"kubernetes.io/projected/7eb134d5-89c6-46e0-ae15-02b2684b117a-kube-api-access-8qj6z\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-systemd\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338061 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-lib-modules\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338094 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-host\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338143 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338185 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-system-cni-dir\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/276dfbb7-a93a-4da8-8b3b-f919c9642bca-hosts-file\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338257 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-cni-bin\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338281 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338345 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxdf\" (UniqueName: \"kubernetes.io/projected/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-kube-api-access-gdxdf\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338432 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338459 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5hs\" (UniqueName: \"kubernetes.io/projected/229b10ec-c401-4070-945d-fa92e56f6443-kube-api-access-ns5hs\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338573 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-modprobe-d\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.338899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338612 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-sys\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.339658 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-var-lib-kubelet\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.339658 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.338668 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-systemd-units\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.361279 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.361231 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:58 +0000 UTC" deadline="2027-12-02 09:26:29.422981309 +0000 UTC"
Apr 21 03:57:59.361375 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.361279 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14165h28m30.061706058s"
Apr 21 03:57:59.429134 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.429067 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 03:57:59.439331 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5c3da277-f4fb-47af-9b13-a2de86f37142-konnectivity-ca\") pod \"konnectivity-agent-fs2vm\" (UID: \"5c3da277-f4fb-47af-9b13-a2de86f37142\") " pod="kube-system/konnectivity-agent-fs2vm"
Apr 21 03:57:59.439443 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439337 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.439443 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439363 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/276dfbb7-a93a-4da8-8b3b-f919c9642bca-tmp-dir\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9"
Apr 21 03:57:59.439443 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-system-cni-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.439443 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439413 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tmr\" (UniqueName: \"kubernetes.io/projected/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-kube-api-access-k8tmr\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.439443 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439437 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-registration-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chhh5\" (UniqueName: \"kubernetes.io/projected/276dfbb7-a93a-4da8-8b3b-f919c9642bca-kube-api-access-chhh5\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-slash\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-ovn\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439573 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovnkube-config\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-kubelet\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-hostroot\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.439681 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysconfig\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-etc-selinux\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-etc-kubernetes\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jx82\" (UniqueName: \"kubernetes.io/projected/64236d25-036a-4831-9f0c-63b1efd05cc1-kube-api-access-4jx82\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439767 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5c3da277-f4fb-47af-9b13-a2de86f37142-agent-certs\") pod \"konnectivity-agent-fs2vm\" (UID: \"5c3da277-f4fb-47af-9b13-a2de86f37142\") " pod="kube-system/konnectivity-agent-fs2vm"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439793 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysctl-d\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-cnibin\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-cni-netd\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-socket-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5c3da277-f4fb-47af-9b13-a2de86f37142-konnectivity-ca\") pod \"konnectivity-agent-fs2vm\" (UID: \"5c3da277-f4fb-47af-9b13-a2de86f37142\") " pod="kube-system/konnectivity-agent-fs2vm"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhl5j\" (UniqueName: \"kubernetes.io/projected/24c30e95-1770-4975-87be-9b1494d8904c-kube-api-access-mhl5j\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/229b10ec-c401-4070-945d-fa92e56f6443-host-slash\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-var-lib-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovnkube-script-lib\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440000 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.439994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysctl-conf\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-run\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440042 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-node-log\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440075 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-env-overrides\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovn-node-metrics-cert\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440163 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-k8s-cni-cncf-io\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-netns\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440221 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-tuned\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440267 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-device-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440301 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9lf\" (UniqueName: \"kubernetes.io/projected/7fbebf58-6bc4-4d05-8b31-3098996af4db-kube-api-access-vq9lf\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-os-release\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-systemd\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64236d25-036a-4831-9f0c-63b1efd05cc1-cni-binary-copy\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440425 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-log-socket\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440479 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhphc\" (UniqueName: \"kubernetes.io/projected/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-kube-api-access-lhphc\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc"
Apr 21 03:57:59.440682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440506 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-host\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-serviceca\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440583 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-run-netns\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-kubernetes\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/276dfbb7-a93a-4da8-8b3b-f919c9642bca-tmp-dir\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/229b10ec-c401-4070-945d-fa92e56f6443-iptables-alerter-script\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440687 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-cni-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440711 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-cnibin\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-registration-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-tmp\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-sys-fs\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qj6z\" (UniqueName: \"kubernetes.io/projected/7eb134d5-89c6-46e0-ae15-02b2684b117a-kube-api-access-8qj6z\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-socket-dir-parent\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-daemon-config\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-multus-certs\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440912 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml"
Apr 21 03:57:59.441538 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-systemd\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-slash\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-lib-modules\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.440997 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-ovn\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd"
Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-host\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc"
Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441068 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-system-cni-dir\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/276dfbb7-a93a-4da8-8b3b-f919c9642bca-hosts-file\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-cni-bin\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-cni-multus\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-conf-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxdf\" (UniqueName: \"kubernetes.io/projected/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-kube-api-access-gdxdf\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441338 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5hs\" (UniqueName: \"kubernetes.io/projected/229b10ec-c401-4070-945d-fa92e56f6443-kube-api-access-ns5hs\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442320 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441418 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-os-release\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441447 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-modprobe-d\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.442905 
ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441473 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-sys\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovnkube-config\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-var-lib-kubelet\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441557 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-kubernetes\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-systemd-units\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 
kubenswrapper[2569]: I0421 03:57:59.441600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-cni-bin\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441610 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysconfig\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-kubelet\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441656 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-etc-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-etc-selinux\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 
03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-etc-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.441989 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/276dfbb7-a93a-4da8-8b3b-f919c9642bca-hosts-file\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-cni-bin\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.442905 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442372 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysctl-d\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-cnibin\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-cni-netd\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/229b10ec-c401-4070-945d-fa92e56f6443-iptables-alerter-script\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-socket-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442707 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/229b10ec-c401-4070-945d-fa92e56f6443-host-slash\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.442745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-var-lib-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443104 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-sys-fs\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443204 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-sysctl-conf\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.443632 
ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-systemd\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443260 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-run\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovnkube-script-lib\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-node-log\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-openvswitch\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:57:59.443332 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-log-socket\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.443413 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:59.443632 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-lib-modules\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.443481 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:57:59.943450499 +0000 UTC m=+3.110846932 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443478 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443512 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-host\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443523 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-var-lib-kubelet\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-system-cni-dir\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443589 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-os-release\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24c30e95-1770-4975-87be-9b1494d8904c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-modprobe-d\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7eb134d5-89c6-46e0-ae15-02b2684b117a-env-overrides\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-run-netns\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:57:59.443654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-systemd-units\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-run-systemd\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-sys\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7eb134d5-89c6-46e0-ae15-02b2684b117a-host-kubelet\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.443830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7fbebf58-6bc4-4d05-8b31-3098996af4db-device-dir\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.444138 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-host\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p" Apr 21 03:57:59.444495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.444283 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24c30e95-1770-4975-87be-9b1494d8904c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.445360 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.444471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-serviceca\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p" Apr 21 03:57:59.446154 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.446125 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5c3da277-f4fb-47af-9b13-a2de86f37142-agent-certs\") pod \"konnectivity-agent-fs2vm\" (UID: \"5c3da277-f4fb-47af-9b13-a2de86f37142\") " pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:57:59.446483 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.446399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7eb134d5-89c6-46e0-ae15-02b2684b117a-ovn-node-metrics-cert\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.447164 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.447146 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-etc-tuned\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.447352 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.447334 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-tmp\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.448089 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.448067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhh5\" (UniqueName: \"kubernetes.io/projected/276dfbb7-a93a-4da8-8b3b-f919c9642bca-kube-api-access-chhh5\") pod \"node-resolver-r5zq9\" (UID: \"276dfbb7-a93a-4da8-8b3b-f919c9642bca\") " pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.455345 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.455298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9lf\" (UniqueName: \"kubernetes.io/projected/7fbebf58-6bc4-4d05-8b31-3098996af4db-kube-api-access-vq9lf\") pod \"aws-ebs-csi-driver-node-vjgwg\" (UID: \"7fbebf58-6bc4-4d05-8b31-3098996af4db\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.455845 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.455654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns5hs\" (UniqueName: \"kubernetes.io/projected/229b10ec-c401-4070-945d-fa92e56f6443-kube-api-access-ns5hs\") pod \"iptables-alerter-zfsw2\" (UID: \"229b10ec-c401-4070-945d-fa92e56f6443\") " pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.455845 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.455662 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhl5j\" (UniqueName: \"kubernetes.io/projected/24c30e95-1770-4975-87be-9b1494d8904c-kube-api-access-mhl5j\") pod \"multus-additional-cni-plugins-wfmzk\" (UID: \"24c30e95-1770-4975-87be-9b1494d8904c\") " pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.456263 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.456139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tmr\" (UniqueName: \"kubernetes.io/projected/33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd-kube-api-access-k8tmr\") pod \"tuned-k9mlc\" (UID: \"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd\") " pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.456705 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.456685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhphc\" (UniqueName: \"kubernetes.io/projected/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-kube-api-access-lhphc\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:57:59.456776 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.456701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qj6z\" (UniqueName: \"kubernetes.io/projected/7eb134d5-89c6-46e0-ae15-02b2684b117a-kube-api-access-8qj6z\") pod \"ovnkube-node-bknrd\" (UID: \"7eb134d5-89c6-46e0-ae15-02b2684b117a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.457032 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.457014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxdf\" (UniqueName: \"kubernetes.io/projected/e7f6204a-5b3d-4f0f-8b89-3111e460af8a-kube-api-access-gdxdf\") pod \"node-ca-bw45p\" (UID: \"e7f6204a-5b3d-4f0f-8b89-3111e460af8a\") " pod="openshift-image-registry/node-ca-bw45p" Apr 21 
03:57:59.542925 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.542898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-cni-multus\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.542933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-conf-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.542958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-os-release\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.542982 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-cni-bin\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543005 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-system-cni-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:57:59.543007 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-cni-multus\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543024 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-conf-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-os-release\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-kubelet\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543043 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-cni-bin\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-var-lib-kubelet\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-system-cni-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-hostroot\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543106 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-hostroot\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-etc-kubernetes\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543132 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jx82\" (UniqueName: 
\"kubernetes.io/projected/64236d25-036a-4831-9f0c-63b1efd05cc1-kube-api-access-4jx82\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-etc-kubernetes\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-k8s-cni-cncf-io\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-netns\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543224 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64236d25-036a-4831-9f0c-63b1efd05cc1-cni-binary-copy\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543270 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-netns\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543275 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-k8s-cni-cncf-io\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-cni-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543328 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-cnibin\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-socket-dir-parent\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-daemon-config\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-cni-dir\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-multus-certs\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.543499 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543424 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:57:59.544217 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-socket-dir-parent\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.544217 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-cnibin\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.544217 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543489 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64236d25-036a-4831-9f0c-63b1efd05cc1-host-run-multus-certs\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.544217 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64236d25-036a-4831-9f0c-63b1efd05cc1-cni-binary-copy\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.544217 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.543905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64236d25-036a-4831-9f0c-63b1efd05cc1-multus-daemon-config\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.548652 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.548631 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:57:59.548652 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.548654 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:57:59.548827 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.548666 2569 projected.go:194] Error preparing data for projected volume 
kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:59.548827 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.548724 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.048707978 +0000 UTC m=+3.216104396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:57:59.550751 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.550733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jx82\" (UniqueName: \"kubernetes.io/projected/64236d25-036a-4831-9f0c-63b1efd05cc1-kube-api-access-4jx82\") pod \"multus-lhbpw\" (UID: \"64236d25-036a-4831-9f0c-63b1efd05cc1\") " pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.626528 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.626494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-r5zq9" Apr 21 03:57:59.635445 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.635414 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:57:59.644061 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.644025 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-zfsw2" Apr 21 03:57:59.648657 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.648639 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" Apr 21 03:57:59.654277 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.654259 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:57:59.659795 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.659780 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" Apr 21 03:57:59.667292 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.667274 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bw45p" Apr 21 03:57:59.673849 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.673827 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" Apr 21 03:57:59.679489 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.679444 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lhbpw" Apr 21 03:57:59.946365 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:57:59.946289 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:57:59.946509 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.946454 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:57:59.946552 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:57:59.946514 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:00.946499141 +0000 UTC m=+4.113895560 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:00.017449 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.017424 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276dfbb7_a93a_4da8_8b3b_f919c9642bca.slice/crio-362c3c57738a2bf2ab892d2d1fd63b014136a09adfcd9d1c0d6e3bbf51e8923a WatchSource:0}: Error finding container 362c3c57738a2bf2ab892d2d1fd63b014136a09adfcd9d1c0d6e3bbf51e8923a: Status 404 returned error can't find the container with id 362c3c57738a2bf2ab892d2d1fd63b014136a09adfcd9d1c0d6e3bbf51e8923a Apr 21 03:58:00.018753 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.018679 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eb134d5_89c6_46e0_ae15_02b2684b117a.slice/crio-9010b52ce5e673bab6f8ce71bbf944cfe92109e1cb3ad90c808ad078fae84c35 WatchSource:0}: Error finding container 9010b52ce5e673bab6f8ce71bbf944cfe92109e1cb3ad90c808ad078fae84c35: Status 404 returned error can't find the container with id 9010b52ce5e673bab6f8ce71bbf944cfe92109e1cb3ad90c808ad078fae84c35 Apr 21 03:58:00.024565 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.024476 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c3da277_f4fb_47af_9b13_a2de86f37142.slice/crio-2c8e6e8332af495f6621f47cde06b99bf813dc65fc99446457ba24427feaff4a WatchSource:0}: Error finding container 2c8e6e8332af495f6621f47cde06b99bf813dc65fc99446457ba24427feaff4a: Status 404 returned error can't find the container with id 2c8e6e8332af495f6621f47cde06b99bf813dc65fc99446457ba24427feaff4a Apr 21 03:58:00.025437 
ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.025413 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f6204a_5b3d_4f0f_8b89_3111e460af8a.slice/crio-86903b8ef29caa9c852ab7670d84c5e823bb0f56fd60e48d369038c110be09f5 WatchSource:0}: Error finding container 86903b8ef29caa9c852ab7670d84c5e823bb0f56fd60e48d369038c110be09f5: Status 404 returned error can't find the container with id 86903b8ef29caa9c852ab7670d84c5e823bb0f56fd60e48d369038c110be09f5 Apr 21 03:58:00.026181 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.026152 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c30e95_1770_4975_87be_9b1494d8904c.slice/crio-84eebc8ea37f5cbed41d003160ac14f676c59f6634fe8e79755347d4a8cc61aa WatchSource:0}: Error finding container 84eebc8ea37f5cbed41d003160ac14f676c59f6634fe8e79755347d4a8cc61aa: Status 404 returned error can't find the container with id 84eebc8ea37f5cbed41d003160ac14f676c59f6634fe8e79755347d4a8cc61aa Apr 21 03:58:00.027108 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.027082 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64236d25_036a_4831_9f0c_63b1efd05cc1.slice/crio-aac412151e70c7a756c81f067b37817a31e911607dc59bfced71d3b0d62f4c44 WatchSource:0}: Error finding container aac412151e70c7a756c81f067b37817a31e911607dc59bfced71d3b0d62f4c44: Status 404 returned error can't find the container with id aac412151e70c7a756c81f067b37817a31e911607dc59bfced71d3b0d62f4c44 Apr 21 03:58:00.029900 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:58:00.028772 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fbebf58_6bc4_4d05_8b31_3098996af4db.slice/crio-b67dcbf6d15f96a1df956141a4c74d947f6b254df184f1328cc6fb5441388d6a WatchSource:0}: 
Error finding container b67dcbf6d15f96a1df956141a4c74d947f6b254df184f1328cc6fb5441388d6a: Status 404 returned error can't find the container with id b67dcbf6d15f96a1df956141a4c74d947f6b254df184f1328cc6fb5441388d6a Apr 21 03:58:00.147789 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.147762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:00.147932 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.147902 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:00.147932 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.147921 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:00.147932 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.147932 2569 projected.go:194] Error preparing data for projected volume kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:00.148097 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.147992 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. No retries permitted until 2026-04-21 03:58:01.147972197 +0000 UTC m=+4.315368629 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:00.362258 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.362213 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 03:52:58 +0000 UTC" deadline="2027-09-30 11:50:39.229541329 +0000 UTC" Apr 21 03:58:00.362258 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.362261 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12655h52m38.867283964s" Apr 21 03:58:00.466743 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.466003 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:00.466743 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.466138 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:00.466743 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.466607 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:00.466743 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.466696 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:00.481873 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.481811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5zq9" event={"ID":"276dfbb7-a93a-4da8-8b3b-f919c9642bca","Type":"ContainerStarted","Data":"362c3c57738a2bf2ab892d2d1fd63b014136a09adfcd9d1c0d6e3bbf51e8923a"} Apr 21 03:58:00.486529 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.486478 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" event={"ID":"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd","Type":"ContainerStarted","Data":"5db4c0f3d90d903597752f9aa0e1e938e4460f715851baedc944f3ece17c5c4f"} Apr 21 03:58:00.493484 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.493453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" event={"ID":"7fbebf58-6bc4-4d05-8b31-3098996af4db","Type":"ContainerStarted","Data":"b67dcbf6d15f96a1df956141a4c74d947f6b254df184f1328cc6fb5441388d6a"} Apr 21 03:58:00.497188 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.497151 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerStarted","Data":"84eebc8ea37f5cbed41d003160ac14f676c59f6634fe8e79755347d4a8cc61aa"} Apr 21 03:58:00.500190 ip-10-0-138-120 
kubenswrapper[2569]: I0421 03:58:00.500145 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lhbpw" event={"ID":"64236d25-036a-4831-9f0c-63b1efd05cc1","Type":"ContainerStarted","Data":"aac412151e70c7a756c81f067b37817a31e911607dc59bfced71d3b0d62f4c44"} Apr 21 03:58:00.501782 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.501743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fs2vm" event={"ID":"5c3da277-f4fb-47af-9b13-a2de86f37142","Type":"ContainerStarted","Data":"2c8e6e8332af495f6621f47cde06b99bf813dc65fc99446457ba24427feaff4a"} Apr 21 03:58:00.504028 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.503980 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zfsw2" event={"ID":"229b10ec-c401-4070-945d-fa92e56f6443","Type":"ContainerStarted","Data":"b2b7df37990714fe9119f60697ad0734cd59a4cb492b21c4b7f89e5341cf7a7f"} Apr 21 03:58:00.511329 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.510609 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" event={"ID":"4c5556f6af36052a906fa0aef20bfb6c","Type":"ContainerStarted","Data":"6357d2a80faadacdd26bc8a62861210f3b0886b2a1765fd1e36c553567b79da0"} Apr 21 03:58:00.520645 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.520617 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bw45p" event={"ID":"e7f6204a-5b3d-4f0f-8b89-3111e460af8a","Type":"ContainerStarted","Data":"86903b8ef29caa9c852ab7670d84c5e823bb0f56fd60e48d369038c110be09f5"} Apr 21 03:58:00.532838 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.532808 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"9010b52ce5e673bab6f8ce71bbf944cfe92109e1cb3ad90c808ad078fae84c35"} Apr 21 
03:58:00.863787 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.863758 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 03:58:00.953493 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:00.952861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:00.953493 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.953062 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:00.953493 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:00.953123 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:02.953105302 +0000 UTC m=+6.120501722 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:01.157068 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:01.157035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:01.157195 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:01.157186 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:01.157275 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:01.157203 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:01.157275 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:01.157216 2569 projected.go:194] Error preparing data for projected volume kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:01.157369 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:01.157288 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:03.157270608 +0000 UTC m=+6.324667026 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:01.546974 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:01.546388 2569 generic.go:358] "Generic (PLEG): container finished" podID="ba5550cc00f884a90499316ee8207508" containerID="580183d0454d32b497fc1d89dc670ba2ebcfa772764674515d82b73ba51958f1" exitCode=0 Apr 21 03:58:01.546974 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:01.546508 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" event={"ID":"ba5550cc00f884a90499316ee8207508","Type":"ContainerDied","Data":"580183d0454d32b497fc1d89dc670ba2ebcfa772764674515d82b73ba51958f1"} Apr 21 03:58:01.561725 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:01.560546 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-120.ec2.internal" podStartSLOduration=3.560529699 podStartE2EDuration="3.560529699s" podCreationTimestamp="2026-04-21 03:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:00.528730425 +0000 UTC m=+3.696126879" watchObservedRunningTime="2026-04-21 03:58:01.560529699 +0000 UTC m=+4.727926132" Apr 21 03:58:02.466399 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:02.465704 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:02.466399 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:02.465828 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:02.466399 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:02.466259 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:02.466399 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:02.466351 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:02.555273 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:02.555029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" event={"ID":"ba5550cc00f884a90499316ee8207508","Type":"ContainerStarted","Data":"78aa2de0e4c063b2adcfdf3f0136dff1b30393d0e72664d0e1f85093f01452b2"} Apr 21 03:58:02.570968 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:02.570070 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-120.ec2.internal" podStartSLOduration=4.5700497890000005 podStartE2EDuration="4.570049789s" podCreationTimestamp="2026-04-21 03:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:58:02.569860672 +0000 UTC m=+5.737257111" watchObservedRunningTime="2026-04-21 03:58:02.570049789 +0000 UTC m=+5.737446230" Apr 21 03:58:02.972924 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:02.972315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:02.972924 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:02.972503 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:02.972924 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:02.972569 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs 
podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:06.972551522 +0000 UTC m=+10.139947960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:03.174762 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:03.174220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:03.174762 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:03.174407 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:03.174762 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:03.174424 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:03.174762 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:03.174435 2569 projected.go:194] Error preparing data for projected volume kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:03.174762 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:03.174493 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. No retries permitted until 2026-04-21 03:58:07.17447362 +0000 UTC m=+10.341870056 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:04.465756 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:04.465720 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:04.466167 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:04.465720 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:04.466167 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:04.465878 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:04.466167 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:04.465937 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:06.465949 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:06.465910 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:06.466360 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:06.466027 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:06.466397 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:06.466376 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:06.466450 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:06.466433 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:07.009139 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:07.008546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:07.009139 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:07.008784 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:07.009139 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:07.008840 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:15.008826827 +0000 UTC m=+18.176223242 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:07.210199 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:07.210145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:07.210400 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:07.210385 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:07.210462 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:07.210406 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:07.210462 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:07.210420 2569 projected.go:194] Error preparing data for projected volume kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:07.210553 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:07.210476 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:15.2104629 +0000 UTC m=+18.377859316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:08.465745 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:08.465706 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:08.466231 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:08.465873 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:08.466231 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:08.465932 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:08.466231 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:08.466059 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:10.466453 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:10.466417 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:10.466862 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:10.466546 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:10.466862 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:10.466590 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:10.466862 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:10.466678 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:12.465841 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:12.465811 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:12.466286 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:12.465813 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:12.466286 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:12.465931 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:12.466286 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:12.466005 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:14.466063 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:14.466030 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:14.466590 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:14.466029 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:14.466590 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:14.466168 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:14.466590 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:14.466233 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:15.070226 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:15.070180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:15.070432 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:15.070348 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:15.070432 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:15.070422 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.070399507 +0000 UTC m=+34.237795924 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:15.271203 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:15.271163 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:15.271397 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:15.271364 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:15.271397 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:15.271388 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:15.271499 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:15.271401 2569 projected.go:194] Error preparing data for projected volume kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:15.271499 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:15.271469 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:31.271450521 +0000 UTC m=+34.438846951 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:16.465775 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:16.465736 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:16.466215 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:16.465738 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:16.466215 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:16.465860 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:16.466215 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:16.465933 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:17.612764 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:17.612526 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fs2vm" event={"ID":"5c3da277-f4fb-47af-9b13-a2de86f37142","Type":"ContainerStarted","Data":"e50abd857cb13e62adb497930c0fb46e4399e333e74e6f8b9760d52775932d94"} Apr 21 03:58:17.627254 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:17.627148 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fs2vm" podStartSLOduration=7.996008447 podStartE2EDuration="20.627132347s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.025998342 +0000 UTC m=+3.193394758" lastFinishedPulling="2026-04-21 03:58:12.657122239 +0000 UTC m=+15.824518658" observedRunningTime="2026-04-21 03:58:17.626955007 +0000 UTC m=+20.794351463" watchObservedRunningTime="2026-04-21 03:58:17.627132347 +0000 UTC m=+20.794528785" Apr 21 03:58:18.465958 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.465786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:18.466122 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.465786 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:18.466122 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:18.466067 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:18.466122 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:18.466089 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:18.619317 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.619280 2569 generic.go:358] "Generic (PLEG): container finished" podID="24c30e95-1770-4975-87be-9b1494d8904c" containerID="1d8f57c43becab72128f818ee35be4d8c6568efc2f944eab333d7b3f5650828d" exitCode=0 Apr 21 03:58:18.620026 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.619344 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerDied","Data":"1d8f57c43becab72128f818ee35be4d8c6568efc2f944eab333d7b3f5650828d"} Apr 21 03:58:18.620660 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.620639 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lhbpw" event={"ID":"64236d25-036a-4831-9f0c-63b1efd05cc1","Type":"ContainerStarted","Data":"a6b05b38449fec1e33e5ddd04c9ce175b8c955d6a3d8d0355dc1008046b7137b"} Apr 21 03:58:18.621896 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.621869 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bw45p" event={"ID":"e7f6204a-5b3d-4f0f-8b89-3111e460af8a","Type":"ContainerStarted","Data":"28ce58fa017c6871ee25e0a7de99809abaf5677284f4d8a85463446ccfa48ad9"} Apr 21 03:58:18.624344 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624302 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log" Apr 21 03:58:18.624820 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624784 2569 generic.go:358] "Generic (PLEG): container finished" podID="7eb134d5-89c6-46e0-ae15-02b2684b117a" containerID="f123c611f32183e354f6f6bfe14538d908959fb56b27216d741e7c009485fe70" exitCode=1 Apr 21 03:58:18.624914 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"37d525b41a0a9c05ed4c6a63def860fa284ddfe402f33108a4eb2dd5d7238e95"} Apr 21 03:58:18.624964 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624914 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"f51e2984ceffbe0a111bcbdc53f1ddc60fdfa9c41f8ec33186215ae91f268cd3"} Apr 21 03:58:18.624964 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624931 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"8e52fdfbfea85fa4511f4079a71eddb20a03789074f4cebe09f46e6a13fbc785"} Apr 21 03:58:18.624964 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"3d2223516a8c33ee11b1f0596eb8d61aa25a15dbab8cc975d91e401cbd10eff0"} Apr 21 03:58:18.625090 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624964 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" 
event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"80809fc408d79dced60077598726b0b83de700ba6f03cfa05b0e999a5206bc38"} Apr 21 03:58:18.625090 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.624977 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerDied","Data":"f123c611f32183e354f6f6bfe14538d908959fb56b27216d741e7c009485fe70"} Apr 21 03:58:18.627491 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.627464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5zq9" event={"ID":"276dfbb7-a93a-4da8-8b3b-f919c9642bca","Type":"ContainerStarted","Data":"25488d0d54626a2e33ddcb59c8b8a89e4532d511795157fd37651677c6bf1ec0"} Apr 21 03:58:18.629181 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.629156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" event={"ID":"33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd","Type":"ContainerStarted","Data":"91ee7bb447332646f236929ac42833b2c3a562c2dba7ee1ab27e2d947e35c685"} Apr 21 03:58:18.630259 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.630221 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" event={"ID":"7fbebf58-6bc4-4d05-8b31-3098996af4db","Type":"ContainerStarted","Data":"ac68dbf0c29d2775d80b51ed1cde57800bd11bebcc06716999bfe386ac0300be"} Apr 21 03:58:18.681866 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.681824 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bw45p" podStartSLOduration=4.403532323 podStartE2EDuration="21.68180842s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.029027019 +0000 UTC m=+3.196423435" lastFinishedPulling="2026-04-21 03:58:17.307303101 +0000 UTC m=+20.474699532" 
observedRunningTime="2026-04-21 03:58:18.658083112 +0000 UTC m=+21.825479549" watchObservedRunningTime="2026-04-21 03:58:18.68180842 +0000 UTC m=+21.849204859" Apr 21 03:58:18.681990 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.681970 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lhbpw" podStartSLOduration=4.347037421 podStartE2EDuration="21.681965786s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.030416722 +0000 UTC m=+3.197813139" lastFinishedPulling="2026-04-21 03:58:17.365345083 +0000 UTC m=+20.532741504" observedRunningTime="2026-04-21 03:58:18.681741729 +0000 UTC m=+21.849138167" watchObservedRunningTime="2026-04-21 03:58:18.681965786 +0000 UTC m=+21.849362223" Apr 21 03:58:18.698610 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.698572 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r5zq9" podStartSLOduration=4.41066438 podStartE2EDuration="21.698561826s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.019406654 +0000 UTC m=+3.186803070" lastFinishedPulling="2026-04-21 03:58:17.307304099 +0000 UTC m=+20.474700516" observedRunningTime="2026-04-21 03:58:18.698407542 +0000 UTC m=+21.865803980" watchObservedRunningTime="2026-04-21 03:58:18.698561826 +0000 UTC m=+21.865958263" Apr 21 03:58:18.714511 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:18.714477 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-k9mlc" podStartSLOduration=4.418148168 podStartE2EDuration="21.71446724s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.031611153 +0000 UTC m=+3.199007570" lastFinishedPulling="2026-04-21 03:58:17.327930219 +0000 UTC m=+20.495326642" observedRunningTime="2026-04-21 03:58:18.714298899 +0000 UTC m=+21.881695337" 
watchObservedRunningTime="2026-04-21 03:58:18.71446724 +0000 UTC m=+21.881863677" Apr 21 03:58:19.044934 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.044908 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 03:58:19.402384 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.402272 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T03:58:19.044927006Z","UUID":"1e64f87c-975e-4ee6-92ea-502ebd3a982e","Handler":null,"Name":"","Endpoint":""} Apr 21 03:58:19.405741 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.405700 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 03:58:19.405741 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.405732 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 03:58:19.634540 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.634505 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" event={"ID":"7fbebf58-6bc4-4d05-8b31-3098996af4db","Type":"ContainerStarted","Data":"5947696e53a47555546a9812a9210b735749579441078658647735ee521d10c9"} Apr 21 03:58:19.636570 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.636061 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zfsw2" event={"ID":"229b10ec-c401-4070-945d-fa92e56f6443","Type":"ContainerStarted","Data":"488900ee9f54c8527ab57541689ea582ed5c09b2aa9266b9c08cbfaa632b8ffa"} Apr 21 03:58:19.650579 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.650535 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zfsw2" podStartSLOduration=5.345798423 podStartE2EDuration="22.650523134s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.023257027 +0000 UTC m=+3.190653458" lastFinishedPulling="2026-04-21 03:58:17.327981747 +0000 UTC m=+20.495378169" observedRunningTime="2026-04-21 03:58:19.650362264 +0000 UTC m=+22.817758703" watchObservedRunningTime="2026-04-21 03:58:19.650523134 +0000 UTC m=+22.817919572" Apr 21 03:58:19.814631 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.814547 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:58:19.815289 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:19.815269 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:58:20.466748 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.466413 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:20.466748 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:20.466686 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:20.466983 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.466413 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:20.467082 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:20.467042 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:20.640591 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.640567 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log" Apr 21 03:58:20.642359 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.642326 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"ce3d0d76043dd9c87958f6493d2ce39a8646ee265bbb9781d617a83401a76cee"} Apr 21 03:58:20.644298 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.644274 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" event={"ID":"7fbebf58-6bc4-4d05-8b31-3098996af4db","Type":"ContainerStarted","Data":"d83d48d798d487661c900c49c61e24a4e2a13f9d81b596f98db68d2314fae7c6"} Apr 21 03:58:20.644731 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.644709 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:58:20.645365 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:20.645348 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fs2vm" Apr 21 03:58:20.660859 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:58:20.660806 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vjgwg" podStartSLOduration=3.497385853 podStartE2EDuration="23.66079626s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.031301797 +0000 UTC m=+3.198698238" lastFinishedPulling="2026-04-21 03:58:20.194712225 +0000 UTC m=+23.362108645" observedRunningTime="2026-04-21 03:58:20.66047906 +0000 UTC m=+23.827875499" watchObservedRunningTime="2026-04-21 03:58:20.66079626 +0000 UTC m=+23.828192698" Apr 21 03:58:22.466094 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:22.466062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:22.466758 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:22.466198 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:22.466758 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:22.466284 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:22.466758 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:22.466365 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:22.651567 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:22.651544 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log" Apr 21 03:58:23.654937 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.654742 2569 generic.go:358] "Generic (PLEG): container finished" podID="24c30e95-1770-4975-87be-9b1494d8904c" containerID="d4d0c1dddf3d7f5d131f2e67a489aaca048c6b07014eb80ffb4198ce3182ab1a" exitCode=0 Apr 21 03:58:23.655506 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.654831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerDied","Data":"d4d0c1dddf3d7f5d131f2e67a489aaca048c6b07014eb80ffb4198ce3182ab1a"} Apr 21 03:58:23.658184 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.658166 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log" Apr 21 03:58:23.658498 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.658468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"446c40d63997fd9ce817dd25181751fd2266e385555dccbd09e03e279f4b504c"} Apr 21 03:58:23.658920 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.658902 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:58:23.659008 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.658930 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 
03:58:23.659008 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.658998 2569 scope.go:117] "RemoveContainer" containerID="f123c611f32183e354f6f6bfe14538d908959fb56b27216d741e7c009485fe70" Apr 21 03:58:23.675178 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:23.675089 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:58:24.466113 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.466082 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:24.466231 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.466082 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:24.466231 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:24.466190 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:24.466385 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:24.466291 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:24.555507 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.555442 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vsvml"] Apr 21 03:58:24.558322 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.558299 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cxzzc"] Apr 21 03:58:24.662194 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.662161 2569 generic.go:358] "Generic (PLEG): container finished" podID="24c30e95-1770-4975-87be-9b1494d8904c" containerID="999e33cd9abbdcff0e9d26cfa619e29a5e667a3cc013b66267f507e96ac70d1d" exitCode=0 Apr 21 03:58:24.662644 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.662258 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerDied","Data":"999e33cd9abbdcff0e9d26cfa619e29a5e667a3cc013b66267f507e96ac70d1d"} Apr 21 03:58:24.665594 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.665576 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log" Apr 21 03:58:24.665917 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.665899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" event={"ID":"7eb134d5-89c6-46e0-ae15-02b2684b117a","Type":"ContainerStarted","Data":"0b7dfdee80e3685b3006795fb24451ddf6940bd7452f470cfc1c6306d6f3db25"} Apr 21 03:58:24.665981 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.665938 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:24.665981 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.665962 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:24.666055 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:24.666037 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:24.666136 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.666123 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:58:24.666258 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:24.666226 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:24.681196 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:24.681169 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:58:25.669058 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:25.669022 2569 generic.go:358] "Generic (PLEG): container finished" podID="24c30e95-1770-4975-87be-9b1494d8904c" containerID="68e01072ad71295c93616de175685d472964f9a52e68e258941a09c230a4ada4" exitCode=0 Apr 21 03:58:25.669489 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:25.669105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerDied","Data":"68e01072ad71295c93616de175685d472964f9a52e68e258941a09c230a4ada4"} Apr 21 03:58:25.689787 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:25.689751 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" podStartSLOduration=11.292945687 podStartE2EDuration="28.689739111s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.021701926 +0000 UTC m=+3.189098357" lastFinishedPulling="2026-04-21 03:58:17.418495347 +0000 UTC m=+20.585891781" observedRunningTime="2026-04-21 03:58:24.709256918 +0000 UTC m=+27.876653356" watchObservedRunningTime="2026-04-21 03:58:25.689739111 +0000 UTC m=+28.857135550" Apr 21 03:58:26.465874 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:26.465845 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:26.466064 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:26.465855 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:26.466064 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:26.465971 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:26.466064 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:26.466014 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:28.466495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:28.466465 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:28.466495 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:28.466503 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:28.467125 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:28.466582 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:28.467125 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:28.466725 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:30.466556 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.466480 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:30.467169 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.466484 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:30.467169 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:30.466611 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxzzc" podUID="03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3" Apr 21 03:58:30.467169 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:30.466699 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vsvml" podUID="97028780-9603-49f8-abdd-0a7e0a1cef8a" Apr 21 03:58:30.603689 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.603657 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-120.ec2.internal" event="NodeReady" Apr 21 03:58:30.603855 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.603793 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 03:58:30.645646 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.645610 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p2sxx"] Apr 21 03:58:30.650062 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.650035 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xmfjq"] Apr 21 03:58:30.650252 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.650216 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.652716 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.652692 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7tx7c\"" Apr 21 03:58:30.652821 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.652726 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 03:58:30.652821 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.652749 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 03:58:30.653423 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.653405 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:30.655455 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.655432 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7mknc\"" Apr 21 03:58:30.655559 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.655467 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 03:58:30.655559 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.655497 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 03:58:30.655656 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.655649 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 03:58:30.661924 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.661901 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p2sxx"] Apr 21 03:58:30.662024 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.661934 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmfjq"] Apr 21 03:58:30.781262 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.781177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/582f3951-c97b-47a0-9cb1-39d85f78b692-tmp-dir\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.781262 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.781221 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod 
\"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.781475 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.781266 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l7k5\" (UniqueName: \"kubernetes.io/projected/582f3951-c97b-47a0-9cb1-39d85f78b692-kube-api-access-7l7k5\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.781475 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.781285 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbcw\" (UniqueName: \"kubernetes.io/projected/4568e112-4a47-4610-a776-ed8ac69d2ca9-kube-api-access-xmbcw\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:30.781475 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.781313 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/582f3951-c97b-47a0-9cb1-39d85f78b692-config-volume\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.781475 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.781331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:30.882695 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.882655 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.882708 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l7k5\" (UniqueName: \"kubernetes.io/projected/582f3951-c97b-47a0-9cb1-39d85f78b692-kube-api-access-7l7k5\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.882731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbcw\" (UniqueName: \"kubernetes.io/projected/4568e112-4a47-4610-a776-ed8ac69d2ca9-kube-api-access-xmbcw\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.882761 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/582f3951-c97b-47a0-9cb1-39d85f78b692-config-volume\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.882785 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:30.882842 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 
03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.882861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/582f3951-c97b-47a0-9cb1-39d85f78b692-tmp-dir\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.882937 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:30.882910 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.382893106 +0000 UTC m=+34.550289527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:58:30.883317 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:30.883013 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:30.883317 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:30.883064 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:31.383046182 +0000 UTC m=+34.550442614 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:58:30.883317 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.883187 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/582f3951-c97b-47a0-9cb1-39d85f78b692-tmp-dir\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.883441 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.883358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/582f3951-c97b-47a0-9cb1-39d85f78b692-config-volume\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.894105 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.894081 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l7k5\" (UniqueName: \"kubernetes.io/projected/582f3951-c97b-47a0-9cb1-39d85f78b692-kube-api-access-7l7k5\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:30.894294 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:30.894273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbcw\" (UniqueName: \"kubernetes.io/projected/4568e112-4a47-4610-a776-ed8ac69d2ca9-kube-api-access-xmbcw\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:31.084332 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:31.084256 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:31.084491 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.084425 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:31.084555 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.084515 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:03.084494639 +0000 UTC m=+66.251891055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 03:58:31.285199 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:31.285167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:31.285367 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.285310 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 03:58:31.285367 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.285324 2569 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 03:58:31.285367 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.285333 2569 projected.go:194] Error preparing data for projected volume kube-api-access-hqj8d for pod openshift-network-diagnostics/network-check-target-vsvml: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:31.285468 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.285381 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d podName:97028780-9603-49f8-abdd-0a7e0a1cef8a nodeName:}" failed. No retries permitted until 2026-04-21 03:59:03.285366556 +0000 UTC m=+66.452762971 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hqj8d" (UniqueName: "kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d") pod "network-check-target-vsvml" (UID: "97028780-9603-49f8-abdd-0a7e0a1cef8a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 03:58:31.385940 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:31.385911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:31.386084 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:31.385949 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" 
(UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:31.386084 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.386050 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:31.386084 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.386055 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:31.386176 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.386109 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:32.386094607 +0000 UTC m=+35.553491034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:58:31.386176 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:31.386122 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:32.386115886 +0000 UTC m=+35.553512302 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:58:31.683634 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:31.683547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerStarted","Data":"e1b790f4f80aa0e19aa6a497dc82ed291db06396d8aeaef65ca0c94aa8a7c98e"} Apr 21 03:58:32.391903 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.391872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:32.392067 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.391915 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:32.392067 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:32.392034 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:32.392067 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:32.392040 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:32.392169 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:32.392101 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls 
podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:34.392082932 +0000 UTC m=+37.559479348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:58:32.392169 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:32.392118 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:34.39211018 +0000 UTC m=+37.559506597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:58:32.465769 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.465737 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:58:32.465769 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.465778 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:58:32.468312 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.468290 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 03:58:32.468438 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.468294 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 03:58:32.468438 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.468296 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 03:58:32.469059 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.469043 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k29s5\"" Apr 21 03:58:32.469144 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.469077 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sjhjf\"" Apr 21 03:58:32.687642 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.687562 2569 generic.go:358] "Generic (PLEG): container finished" podID="24c30e95-1770-4975-87be-9b1494d8904c" containerID="e1b790f4f80aa0e19aa6a497dc82ed291db06396d8aeaef65ca0c94aa8a7c98e" exitCode=0 Apr 21 03:58:32.687642 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:32.687613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerDied","Data":"e1b790f4f80aa0e19aa6a497dc82ed291db06396d8aeaef65ca0c94aa8a7c98e"} Apr 21 03:58:33.691714 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:33.691541 2569 generic.go:358] "Generic (PLEG): container finished" podID="24c30e95-1770-4975-87be-9b1494d8904c" 
containerID="e0a138b6e5fb06770b7512e716aaff8755f8edfd1dd811ae6636b02ef8b08750" exitCode=0 Apr 21 03:58:33.692052 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:33.691623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerDied","Data":"e0a138b6e5fb06770b7512e716aaff8755f8edfd1dd811ae6636b02ef8b08750"} Apr 21 03:58:34.403704 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:34.403672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:34.403855 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:34.403734 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:34.403855 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:34.403807 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:34.403855 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:34.403827 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:34.403949 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:34.403869 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:38.403853455 +0000 UTC m=+41.571249875 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:58:34.403949 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:34.403882 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:38.403876513 +0000 UTC m=+41.571272929 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:58:34.696301 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:34.696217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" event={"ID":"24c30e95-1770-4975-87be-9b1494d8904c","Type":"ContainerStarted","Data":"5924fbf8539062dc24ed632c0c0ceaa39ac23df430313fae94771dc0bd5eed5d"} Apr 21 03:58:34.719156 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:34.719112 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wfmzk" podStartSLOduration=6.282302041 podStartE2EDuration="37.719097994s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:58:00.029160472 +0000 UTC m=+3.196556894" lastFinishedPulling="2026-04-21 03:58:31.465956414 +0000 UTC m=+34.633352847" observedRunningTime="2026-04-21 03:58:34.717563754 +0000 UTC m=+37.884960190" watchObservedRunningTime="2026-04-21 03:58:34.719097994 +0000 UTC m=+37.886494432" Apr 21 03:58:38.427484 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:38.427448 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:38.427924 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:38.427493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:38.427924 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:38.427612 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:38.427924 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:38.427624 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:38.427924 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:38.427665 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:58:46.427650695 +0000 UTC m=+49.595047135 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:58:38.427924 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:38.427701 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. 
No retries permitted until 2026-04-21 03:58:46.42768343 +0000 UTC m=+49.595079857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:58:46.482328 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:46.482288 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:58:46.482819 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:46.482343 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:58:46.482819 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:46.482438 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:58:46.482819 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:46.482439 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:58:46.482819 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:46.482510 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:02.482490223 +0000 UTC m=+65.649886661 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:58:46.482819 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:58:46.482529 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:02.482520382 +0000 UTC m=+65.649916807 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:58:56.681593 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:58:56.681566 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bknrd" Apr 21 03:59:02.489573 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:02.489534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:59:02.489573 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:02.489581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:59:02.489995 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:02.489682 2569 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:59:02.489995 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:02.489687 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:59:02.489995 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:02.489746 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:34.489726788 +0000 UTC m=+97.657123217 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:59:02.489995 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:02.489763 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:34.489756145 +0000 UTC m=+97.657152561 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:59:03.094481 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.094439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 03:59:03.097137 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.097118 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 03:59:03.104981 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:03.104964 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 03:59:03.105055 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:03.105018 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs podName:03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:07.105004147 +0000 UTC m=+130.272400563 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs") pod "network-metrics-daemon-cxzzc" (UID: "03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3") : secret "metrics-daemon-secret" not found Apr 21 03:59:03.295649 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.295594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:59:03.298126 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.298109 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 03:59:03.307945 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.307912 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 03:59:03.319841 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.319810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqj8d\" (UniqueName: \"kubernetes.io/projected/97028780-9603-49f8-abdd-0a7e0a1cef8a-kube-api-access-hqj8d\") pod \"network-check-target-vsvml\" (UID: \"97028780-9603-49f8-abdd-0a7e0a1cef8a\") " pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:59:03.382625 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.382553 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sjhjf\"" Apr 21 03:59:03.390375 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.390352 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:59:03.516533 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.516503 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vsvml"] Apr 21 03:59:03.520047 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:03.520002 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97028780_9603_49f8_abdd_0a7e0a1cef8a.slice/crio-71d9a1b7b938db12e1b98f636a67bec4c725531faca9342658f425274c208197 WatchSource:0}: Error finding container 71d9a1b7b938db12e1b98f636a67bec4c725531faca9342658f425274c208197: Status 404 returned error can't find the container with id 71d9a1b7b938db12e1b98f636a67bec4c725531faca9342658f425274c208197 Apr 21 03:59:03.753455 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:03.753367 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vsvml" event={"ID":"97028780-9603-49f8-abdd-0a7e0a1cef8a","Type":"ContainerStarted","Data":"71d9a1b7b938db12e1b98f636a67bec4c725531faca9342658f425274c208197"} Apr 21 03:59:06.760344 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:06.760216 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vsvml" event={"ID":"97028780-9603-49f8-abdd-0a7e0a1cef8a","Type":"ContainerStarted","Data":"82e73431fff9d9d904911c9f45b770409420aa962de92c6b01a068d17150903e"} Apr 21 03:59:06.760682 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:06.760464 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:59:06.775190 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:06.775152 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vsvml" 
podStartSLOduration=67.193816942 podStartE2EDuration="1m9.77514166s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 03:59:03.522276206 +0000 UTC m=+66.689672626" lastFinishedPulling="2026-04-21 03:59:06.103600909 +0000 UTC m=+69.270997344" observedRunningTime="2026-04-21 03:59:06.774779092 +0000 UTC m=+69.942175541" watchObservedRunningTime="2026-04-21 03:59:06.77514166 +0000 UTC m=+69.942538098" Apr 21 03:59:23.327497 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.327456 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t"] Apr 21 03:59:23.332310 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.332285 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" Apr 21 03:59:23.334675 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.334647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:59:23.335485 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.335465 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7hbrx\"" Apr 21 03:59:23.335587 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.335471 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 03:59:23.336322 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.336302 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t"] Apr 21 03:59:23.430315 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.430290 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-operator-585dfdc468-657t5"] Apr 21 03:59:23.433053 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.433038 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.435424 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.435405 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 03:59:23.435554 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.435494 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 03:59:23.435818 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.435799 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 03:59:23.436023 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.436005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-4xztn\"" Apr 21 03:59:23.436023 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.436020 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 03:59:23.440936 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.440913 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-657t5"] Apr 21 03:59:23.441442 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.441425 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 03:59:23.527477 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.527441 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpgtv\" (UniqueName: 
\"kubernetes.io/projected/54eef815-d11b-4203-b390-7baf2c44b620-kube-api-access-jpgtv\") pod \"volume-data-source-validator-7c6cbb6c87-8g58t\" (UID: \"54eef815-d11b-4203-b390-7baf2c44b620\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" Apr 21 03:59:23.627701 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627674 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpgtv\" (UniqueName: \"kubernetes.io/projected/54eef815-d11b-4203-b390-7baf2c44b620-kube-api-access-jpgtv\") pod \"volume-data-source-validator-7c6cbb6c87-8g58t\" (UID: \"54eef815-d11b-4203-b390-7baf2c44b620\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" Apr 21 03:59:23.627822 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627707 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7bfv\" (UniqueName: \"kubernetes.io/projected/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-kube-api-access-r7bfv\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.627822 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-snapshots\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.627822 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.627970 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627891 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-service-ca-bundle\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.627970 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627916 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-serving-cert\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.627970 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.627955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-tmp\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.635764 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.635740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpgtv\" (UniqueName: \"kubernetes.io/projected/54eef815-d11b-4203-b390-7baf2c44b620-kube-api-access-jpgtv\") pod \"volume-data-source-validator-7c6cbb6c87-8g58t\" (UID: \"54eef815-d11b-4203-b390-7baf2c44b620\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" Apr 21 03:59:23.641502 ip-10-0-138-120 
kubenswrapper[2569]: I0421 03:59:23.641485 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" Apr 21 03:59:23.728901 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.728868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7bfv\" (UniqueName: \"kubernetes.io/projected/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-kube-api-access-r7bfv\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729031 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.728919 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-snapshots\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729031 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.728944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729031 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.728969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-service-ca-bundle\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729031 ip-10-0-138-120 kubenswrapper[2569]: 
I0421 03:59:23.728991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-serving-cert\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729031 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.729030 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-tmp\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729609 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.729585 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-tmp\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729725 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.729671 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-service-ca-bundle\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.729725 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.729683 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-snapshots\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " 
pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.730138 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.730114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.731831 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.731791 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-serving-cert\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.737036 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.737011 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7bfv\" (UniqueName: \"kubernetes.io/projected/2fea1f5b-33a1-4b14-a6f9-dc99d97673d2-kube-api-access-r7bfv\") pod \"insights-operator-585dfdc468-657t5\" (UID: \"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2\") " pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.742640 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.742621 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-657t5" Apr 21 03:59:23.751663 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.751641 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t"] Apr 21 03:59:23.755354 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:23.755332 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54eef815_d11b_4203_b390_7baf2c44b620.slice/crio-74ccf70215ed074d5f3714090923b9124d0d913a1e5916bb39b155ea9f974713 WatchSource:0}: Error finding container 74ccf70215ed074d5f3714090923b9124d0d913a1e5916bb39b155ea9f974713: Status 404 returned error can't find the container with id 74ccf70215ed074d5f3714090923b9124d0d913a1e5916bb39b155ea9f974713 Apr 21 03:59:23.805450 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.794756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" event={"ID":"54eef815-d11b-4203-b390-7baf2c44b620","Type":"ContainerStarted","Data":"74ccf70215ed074d5f3714090923b9124d0d913a1e5916bb39b155ea9f974713"} Apr 21 03:59:23.880282 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:23.880186 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-657t5"] Apr 21 03:59:23.883550 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:23.883520 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fea1f5b_33a1_4b14_a6f9_dc99d97673d2.slice/crio-0172e7e37bf39db0714c31a1647493bf20e3f7e14184d41188581dfb989cf831 WatchSource:0}: Error finding container 0172e7e37bf39db0714c31a1647493bf20e3f7e14184d41188581dfb989cf831: Status 404 returned error can't find the container with id 0172e7e37bf39db0714c31a1647493bf20e3f7e14184d41188581dfb989cf831 
Apr 21 03:59:24.798218 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:24.798164 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-657t5" event={"ID":"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2","Type":"ContainerStarted","Data":"0172e7e37bf39db0714c31a1647493bf20e3f7e14184d41188581dfb989cf831"} Apr 21 03:59:25.801825 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:25.801785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-657t5" event={"ID":"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2","Type":"ContainerStarted","Data":"d2a12f4e1786f63e48c7ab4120267a74fba0e4903b351ab6bae40b8a2491ab6e"} Apr 21 03:59:25.803161 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:25.803136 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" event={"ID":"54eef815-d11b-4203-b390-7baf2c44b620","Type":"ContainerStarted","Data":"abbc92c865c409245e40ebac85e679bed2c8f8d6826806bedb6e8539948168f0"} Apr 21 03:59:25.816938 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:25.816892 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-657t5" podStartSLOduration=1.022093016 podStartE2EDuration="2.816878522s" podCreationTimestamp="2026-04-21 03:59:23 +0000 UTC" firstStartedPulling="2026-04-21 03:59:23.885424235 +0000 UTC m=+87.052820651" lastFinishedPulling="2026-04-21 03:59:25.680209722 +0000 UTC m=+88.847606157" observedRunningTime="2026-04-21 03:59:25.816337962 +0000 UTC m=+88.983734412" watchObservedRunningTime="2026-04-21 03:59:25.816878522 +0000 UTC m=+88.984274962" Apr 21 03:59:25.829177 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:25.829125 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8g58t" podStartSLOduration=1.506426076 
podStartE2EDuration="2.829112714s" podCreationTimestamp="2026-04-21 03:59:23 +0000 UTC" firstStartedPulling="2026-04-21 03:59:23.757135749 +0000 UTC m=+86.924532166" lastFinishedPulling="2026-04-21 03:59:25.079822388 +0000 UTC m=+88.247218804" observedRunningTime="2026-04-21 03:59:25.829052538 +0000 UTC m=+88.996448975" watchObservedRunningTime="2026-04-21 03:59:25.829112714 +0000 UTC m=+88.996509153" Apr 21 03:59:27.328615 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.328586 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2"] Apr 21 03:59:27.331535 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.331516 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.333703 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.333678 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qv8xx\"" Apr 21 03:59:27.334427 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.334412 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 03:59:27.334497 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.334480 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:59:27.334537 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.334522 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 03:59:27.337726 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.337706 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2"] Apr 21 03:59:27.357597 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.357565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.357708 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.357691 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74spn\" (UniqueName: \"kubernetes.io/projected/6035bacf-c9f3-42ee-bbb0-49c433954da7-kube-api-access-74spn\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.458127 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.458092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.458321 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.458212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74spn\" (UniqueName: \"kubernetes.io/projected/6035bacf-c9f3-42ee-bbb0-49c433954da7-kube-api-access-74spn\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.458321 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:27.458262 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:59:27.458321 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:27.458324 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls podName:6035bacf-c9f3-42ee-bbb0-49c433954da7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:27.958307461 +0000 UTC m=+91.125703881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2mhj2" (UID: "6035bacf-c9f3-42ee-bbb0-49c433954da7") : secret "samples-operator-tls" not found Apr 21 03:59:27.466151 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.466127 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74spn\" (UniqueName: \"kubernetes.io/projected/6035bacf-c9f3-42ee-bbb0-49c433954da7-kube-api-access-74spn\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.960876 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:27.960821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:27.961063 ip-10-0-138-120 kubenswrapper[2569]: E0421 
03:59:27.961002 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:59:27.961133 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:27.961086 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls podName:6035bacf-c9f3-42ee-bbb0-49c433954da7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:28.961064205 +0000 UTC m=+92.128460625 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2mhj2" (UID: "6035bacf-c9f3-42ee-bbb0-49c433954da7") : secret "samples-operator-tls" not found Apr 21 03:59:28.919132 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:28.919100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r5zq9_276dfbb7-a93a-4da8-8b3b-f919c9642bca/dns-node-resolver/0.log" Apr 21 03:59:28.967456 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:28.967427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:28.967584 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:28.967547 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:59:28.967622 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:28.967597 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls podName:6035bacf-c9f3-42ee-bbb0-49c433954da7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:30.967584808 +0000 UTC m=+94.134981228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2mhj2" (UID: "6035bacf-c9f3-42ee-bbb0-49c433954da7") : secret "samples-operator-tls" not found Apr 21 03:59:29.321297 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.321221 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bw45p_e7f6204a-5b3d-4f0f-8b89-3111e460af8a/node-ca/0.log" Apr 21 03:59:29.325946 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.325926 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-cjfz9"] Apr 21 03:59:29.328848 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.328834 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.331496 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.331474 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-bkcng\"" Apr 21 03:59:29.331629 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.331552 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 03:59:29.331904 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.331888 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 03:59:29.332100 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.332077 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 03:59:29.332254 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.332223 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:59:29.335521 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.335488 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 03:59:29.336263 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.336034 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-cjfz9"] Apr 21 03:59:29.370694 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.370669 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f274aa3-e379-471c-9815-cb6212a1b1ef-config\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.370824 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.370697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f274aa3-e379-471c-9815-cb6212a1b1ef-serving-cert\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.370824 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.370718 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f274aa3-e379-471c-9815-cb6212a1b1ef-trusted-ca\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.370824 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.370740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzfdj\" (UniqueName: \"kubernetes.io/projected/6f274aa3-e379-471c-9815-cb6212a1b1ef-kube-api-access-pzfdj\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.471055 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.471027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f274aa3-e379-471c-9815-cb6212a1b1ef-config\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.471055 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.471059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f274aa3-e379-471c-9815-cb6212a1b1ef-serving-cert\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.471291 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.471076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f274aa3-e379-471c-9815-cb6212a1b1ef-trusted-ca\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.471291 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.471095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzfdj\" (UniqueName: \"kubernetes.io/projected/6f274aa3-e379-471c-9815-cb6212a1b1ef-kube-api-access-pzfdj\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.471665 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.471645 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f274aa3-e379-471c-9815-cb6212a1b1ef-config\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.471775 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.471757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6f274aa3-e379-471c-9815-cb6212a1b1ef-trusted-ca\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.473647 
ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.473626 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f274aa3-e379-471c-9815-cb6212a1b1ef-serving-cert\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.478678 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.478656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzfdj\" (UniqueName: \"kubernetes.io/projected/6f274aa3-e379-471c-9815-cb6212a1b1ef-kube-api-access-pzfdj\") pod \"console-operator-9d4b6777b-cjfz9\" (UID: \"6f274aa3-e379-471c-9815-cb6212a1b1ef\") " pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.638791 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.638759 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:29.754794 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.754767 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-cjfz9"] Apr 21 03:59:29.757927 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:29.757895 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f274aa3_e379_471c_9815_cb6212a1b1ef.slice/crio-1c240680aba7bec7f598b863ded95bc5bdbd00b56c758cd72a2806c73f7b4f8b WatchSource:0}: Error finding container 1c240680aba7bec7f598b863ded95bc5bdbd00b56c758cd72a2806c73f7b4f8b: Status 404 returned error can't find the container with id 1c240680aba7bec7f598b863ded95bc5bdbd00b56c758cd72a2806c73f7b4f8b Apr 21 03:59:29.810601 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:29.810576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" event={"ID":"6f274aa3-e379-471c-9815-cb6212a1b1ef","Type":"ContainerStarted","Data":"1c240680aba7bec7f598b863ded95bc5bdbd00b56c758cd72a2806c73f7b4f8b"} Apr 21 03:59:30.979950 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:30.979898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:30.980450 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:30.980052 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:59:30.980450 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:30.980133 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls podName:6035bacf-c9f3-42ee-bbb0-49c433954da7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:34.98011282 +0000 UTC m=+98.147509237 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2mhj2" (UID: "6035bacf-c9f3-42ee-bbb0-49c433954da7") : secret "samples-operator-tls" not found Apr 21 03:59:31.815438 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:31.815412 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/0.log" Apr 21 03:59:31.815638 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:31.815452 2569 generic.go:358] "Generic (PLEG): container finished" podID="6f274aa3-e379-471c-9815-cb6212a1b1ef" containerID="80184aa5236d0cde7ac175e0854184a82ddf7d93ff9e5d0d9e8e2201edb87cf5" exitCode=255 Apr 21 03:59:31.815638 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:31.815508 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" event={"ID":"6f274aa3-e379-471c-9815-cb6212a1b1ef","Type":"ContainerDied","Data":"80184aa5236d0cde7ac175e0854184a82ddf7d93ff9e5d0d9e8e2201edb87cf5"} Apr 21 03:59:31.815803 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:31.815785 2569 scope.go:117] "RemoveContainer" containerID="80184aa5236d0cde7ac175e0854184a82ddf7d93ff9e5d0d9e8e2201edb87cf5" Apr 21 03:59:32.818884 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:32.818857 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log" Apr 21 03:59:32.819277 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:32.819201 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/0.log" Apr 21 03:59:32.819277 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:59:32.819256 2569 generic.go:358] "Generic (PLEG): container finished" podID="6f274aa3-e379-471c-9815-cb6212a1b1ef" containerID="e1bc6b5ebb4b33df44640cb933543bfbaab3796c29ed3f73eaf3d2263d2709d1" exitCode=255 Apr 21 03:59:32.819385 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:32.819328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" event={"ID":"6f274aa3-e379-471c-9815-cb6212a1b1ef","Type":"ContainerDied","Data":"e1bc6b5ebb4b33df44640cb933543bfbaab3796c29ed3f73eaf3d2263d2709d1"} Apr 21 03:59:32.819385 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:32.819366 2569 scope.go:117] "RemoveContainer" containerID="80184aa5236d0cde7ac175e0854184a82ddf7d93ff9e5d0d9e8e2201edb87cf5" Apr 21 03:59:32.819545 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:32.819527 2569 scope.go:117] "RemoveContainer" containerID="e1bc6b5ebb4b33df44640cb933543bfbaab3796c29ed3f73eaf3d2263d2709d1" Apr 21 03:59:32.819762 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:32.819741 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjfz9_openshift-console-operator(6f274aa3-e379-471c-9815-cb6212a1b1ef)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" podUID="6f274aa3-e379-471c-9815-cb6212a1b1ef" Apr 21 03:59:33.327696 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.327652 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr"] Apr 21 03:59:33.330637 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.330622 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.332895 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.332868 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 03:59:33.333006 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.332898 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 03:59:33.333648 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.333626 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-f8dr8\"" Apr 21 03:59:33.333752 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.333699 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 03:59:33.333752 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.333709 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 03:59:33.341227 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.341203 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr"] Apr 21 03:59:33.394972 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.394949 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-config\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.395093 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.394981 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4m6\" (UniqueName: \"kubernetes.io/projected/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-kube-api-access-hd4m6\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.395093 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.395033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.496004 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.495984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.496143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.496033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-config\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.496143 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.496053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4m6\" (UniqueName: \"kubernetes.io/projected/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-kube-api-access-hd4m6\") pod 
\"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.496579 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.496558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-config\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.498331 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.498315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-serving-cert\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.504041 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.504015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4m6\" (UniqueName: \"kubernetes.io/projected/15ac9e93-0c06-48fe-bfdb-ba6d11fedd31-kube-api-access-hd4m6\") pod \"service-ca-operator-d6fc45fc5-h8mkr\" (UID: \"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.639446 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.639413 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" Apr 21 03:59:33.753724 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.753690 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr"] Apr 21 03:59:33.756827 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:33.756794 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ac9e93_0c06_48fe_bfdb_ba6d11fedd31.slice/crio-3130c971ad049871f690466dde99d2b7595e2687fe7eced12a666968bcdd1c12 WatchSource:0}: Error finding container 3130c971ad049871f690466dde99d2b7595e2687fe7eced12a666968bcdd1c12: Status 404 returned error can't find the container with id 3130c971ad049871f690466dde99d2b7595e2687fe7eced12a666968bcdd1c12 Apr 21 03:59:33.822110 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.822086 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log" Apr 21 03:59:33.822522 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.822506 2569 scope.go:117] "RemoveContainer" containerID="e1bc6b5ebb4b33df44640cb933543bfbaab3796c29ed3f73eaf3d2263d2709d1" Apr 21 03:59:33.822731 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:33.822706 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjfz9_openshift-console-operator(6f274aa3-e379-471c-9815-cb6212a1b1ef)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" podUID="6f274aa3-e379-471c-9815-cb6212a1b1ef" Apr 21 03:59:33.823425 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:33.823402 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" event={"ID":"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31","Type":"ContainerStarted","Data":"3130c971ad049871f690466dde99d2b7595e2687fe7eced12a666968bcdd1c12"} Apr 21 03:59:34.504887 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:34.504848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 03:59:34.505057 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:34.504902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 03:59:34.505057 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:34.505006 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 03:59:34.505057 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:34.505009 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 03:59:34.505225 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:34.505075 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert podName:4568e112-4a47-4610-a776-ed8ac69d2ca9 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:38.505055566 +0000 UTC m=+161.672451986 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert") pod "ingress-canary-xmfjq" (UID: "4568e112-4a47-4610-a776-ed8ac69d2ca9") : secret "canary-serving-cert" not found Apr 21 03:59:34.505225 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:34.505092 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls podName:582f3951-c97b-47a0-9cb1-39d85f78b692 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:38.505083519 +0000 UTC m=+161.672479936 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls") pod "dns-default-p2sxx" (UID: "582f3951-c97b-47a0-9cb1-39d85f78b692") : secret "dns-default-metrics-tls" not found Apr 21 03:59:35.008457 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:35.008419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:35.008851 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:35.008586 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 03:59:35.008851 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:35.008663 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls podName:6035bacf-c9f3-42ee-bbb0-49c433954da7 nodeName:}" failed. No retries permitted until 2026-04-21 03:59:43.008644625 +0000 UTC m=+106.176041042 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2mhj2" (UID: "6035bacf-c9f3-42ee-bbb0-49c433954da7") : secret "samples-operator-tls" not found Apr 21 03:59:35.828915 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:35.828838 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" event={"ID":"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31","Type":"ContainerStarted","Data":"e0b1b6ce29b542201494b99ad3a3593ec1397af60db39b5960340f6e0409a656"} Apr 21 03:59:35.843677 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:35.843639 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" podStartSLOduration=1.219365478 podStartE2EDuration="2.843627116s" podCreationTimestamp="2026-04-21 03:59:33 +0000 UTC" firstStartedPulling="2026-04-21 03:59:33.758602114 +0000 UTC m=+96.925998530" lastFinishedPulling="2026-04-21 03:59:35.382863735 +0000 UTC m=+98.550260168" observedRunningTime="2026-04-21 03:59:35.842314858 +0000 UTC m=+99.009711297" watchObservedRunningTime="2026-04-21 03:59:35.843627116 +0000 UTC m=+99.011023587" Apr 21 03:59:36.036110 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.036071 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-55n56"] Apr 21 03:59:36.039314 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.039297 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" Apr 21 03:59:36.041397 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.041376 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-zkf7k\"" Apr 21 03:59:36.044825 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.044801 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-55n56"] Apr 21 03:59:36.117563 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.117531 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g752\" (UniqueName: \"kubernetes.io/projected/79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6-kube-api-access-6g752\") pod \"network-check-source-8894fc9bd-55n56\" (UID: \"79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" Apr 21 03:59:36.218810 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.218782 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g752\" (UniqueName: \"kubernetes.io/projected/79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6-kube-api-access-6g752\") pod \"network-check-source-8894fc9bd-55n56\" (UID: \"79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" Apr 21 03:59:36.226843 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.226817 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g752\" (UniqueName: \"kubernetes.io/projected/79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6-kube-api-access-6g752\") pod \"network-check-source-8894fc9bd-55n56\" (UID: \"79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" Apr 21 03:59:36.348048 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:59:36.348010 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" Apr 21 03:59:36.464233 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.464206 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-55n56"] Apr 21 03:59:36.467084 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:36.467060 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79dfa283_9e0e_4895_ba2d_3c4af2c1bbd6.slice/crio-89ff2f6b57b369b9c21ab1580de127a9c3ad7a914ad731a84d6935c8d7a50f8f WatchSource:0}: Error finding container 89ff2f6b57b369b9c21ab1580de127a9c3ad7a914ad731a84d6935c8d7a50f8f: Status 404 returned error can't find the container with id 89ff2f6b57b369b9c21ab1580de127a9c3ad7a914ad731a84d6935c8d7a50f8f Apr 21 03:59:36.833057 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.832973 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" event={"ID":"79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6","Type":"ContainerStarted","Data":"29888edbc45804a64972bf6518e9592882a71fb16f3d69bea2f0398f9d604005"} Apr 21 03:59:36.833057 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.833007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" event={"ID":"79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6","Type":"ContainerStarted","Data":"89ff2f6b57b369b9c21ab1580de127a9c3ad7a914ad731a84d6935c8d7a50f8f"} Apr 21 03:59:36.846564 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:36.846520 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-55n56" podStartSLOduration=0.846507577 podStartE2EDuration="846.507577ms" podCreationTimestamp="2026-04-21 03:59:36 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 03:59:36.846497639 +0000 UTC m=+100.013894078" watchObservedRunningTime="2026-04-21 03:59:36.846507577 +0000 UTC m=+100.013904011" Apr 21 03:59:37.764985 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:37.764959 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vsvml" Apr 21 03:59:39.638872 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.638837 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:39.638872 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.638873 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:39.639321 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.639225 2569 scope.go:117] "RemoveContainer" containerID="e1bc6b5ebb4b33df44640cb933543bfbaab3796c29ed3f73eaf3d2263d2709d1" Apr 21 03:59:39.639424 ip-10-0-138-120 kubenswrapper[2569]: E0421 03:59:39.639406 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-cjfz9_openshift-console-operator(6f274aa3-e379-471c-9815-cb6212a1b1ef)\"" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" podUID="6f274aa3-e379-471c-9815-cb6212a1b1ef" Apr 21 03:59:39.669081 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.669055 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fmcxv"] Apr 21 03:59:39.706867 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.706841 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-865cb79987-fmcxv"] Apr 21 03:59:39.706967 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.706942 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.709762 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.709748 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 03:59:39.710433 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.710411 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 03:59:39.710433 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.710435 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 03:59:39.710571 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.710436 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 03:59:39.710571 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.710492 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xr6dz\"" Apr 21 03:59:39.742479 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.742456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-signing-key\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.742574 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.742523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-signing-cabundle\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.742574 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.742562 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvv58\" (UniqueName: \"kubernetes.io/projected/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-kube-api-access-fvv58\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.843658 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.843630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-signing-cabundle\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.843658 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.843663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvv58\" (UniqueName: \"kubernetes.io/projected/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-kube-api-access-fvv58\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.843811 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.843732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-signing-key\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.844385 ip-10-0-138-120 kubenswrapper[2569]: I0421 
03:59:39.844369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-signing-cabundle\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.846324 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.846307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-signing-key\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:39.852100 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:39.852073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvv58\" (UniqueName: \"kubernetes.io/projected/416d9de7-6589-4d72-b2c4-e0f4d6db0ed3-kube-api-access-fvv58\") pod \"service-ca-865cb79987-fmcxv\" (UID: \"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3\") " pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:40.015899 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:40.015810 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-fmcxv" Apr 21 03:59:40.129093 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:40.129064 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-fmcxv"] Apr 21 03:59:40.132201 ip-10-0-138-120 kubenswrapper[2569]: W0421 03:59:40.132174 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod416d9de7_6589_4d72_b2c4_e0f4d6db0ed3.slice/crio-3d37f39afdf14342876d65f2413e77e9bc2241d7dbbcb03a38a95b1911c9043e WatchSource:0}: Error finding container 3d37f39afdf14342876d65f2413e77e9bc2241d7dbbcb03a38a95b1911c9043e: Status 404 returned error can't find the container with id 3d37f39afdf14342876d65f2413e77e9bc2241d7dbbcb03a38a95b1911c9043e Apr 21 03:59:40.844531 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:40.844497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fmcxv" event={"ID":"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3","Type":"ContainerStarted","Data":"53d783f8ed8115e27c7470b8b761038f08a7d25839c84c910f3cbe56dddafb14"} Apr 21 03:59:40.844531 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:40.844530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-fmcxv" event={"ID":"416d9de7-6589-4d72-b2c4-e0f4d6db0ed3","Type":"ContainerStarted","Data":"3d37f39afdf14342876d65f2413e77e9bc2241d7dbbcb03a38a95b1911c9043e"} Apr 21 03:59:40.859903 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:40.859853 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-fmcxv" podStartSLOduration=1.8598388940000001 podStartE2EDuration="1.859838894s" podCreationTimestamp="2026-04-21 03:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
03:59:40.859269975 +0000 UTC m=+104.026666414" watchObservedRunningTime="2026-04-21 03:59:40.859838894 +0000 UTC m=+104.027235332" Apr 21 03:59:43.068764 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:43.068729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:43.071211 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:43.071186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6035bacf-c9f3-42ee-bbb0-49c433954da7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2mhj2\" (UID: \"6035bacf-c9f3-42ee-bbb0-49c433954da7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:43.241254 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:43.241207 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" Apr 21 03:59:43.358405 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:43.358317 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2"] Apr 21 03:59:43.858981 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:43.858942 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" event={"ID":"6035bacf-c9f3-42ee-bbb0-49c433954da7","Type":"ContainerStarted","Data":"d1e95b508e7e30686869d86a27d17b851b8cfe9bc412fb91158706e5be2e36c4"} Apr 21 03:59:44.863176 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:44.863141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" event={"ID":"6035bacf-c9f3-42ee-bbb0-49c433954da7","Type":"ContainerStarted","Data":"3f8542fba36733f9310a300223d74314722f30b202c7270a52cd915c3d4678fc"} Apr 21 03:59:44.863176 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:44.863179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" event={"ID":"6035bacf-c9f3-42ee-bbb0-49c433954da7","Type":"ContainerStarted","Data":"f2730509c0395b7ea1dfd481fbbfeaf1e9046c51d05dd63de058de63edd95ede"} Apr 21 03:59:44.878604 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:44.878560 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2mhj2" podStartSLOduration=16.574892994 podStartE2EDuration="17.87854653s" podCreationTimestamp="2026-04-21 03:59:27 +0000 UTC" firstStartedPulling="2026-04-21 03:59:43.401781037 +0000 UTC m=+106.569177455" lastFinishedPulling="2026-04-21 03:59:44.705434573 +0000 UTC m=+107.872830991" observedRunningTime="2026-04-21 
03:59:44.877269748 +0000 UTC m=+108.044666185" watchObservedRunningTime="2026-04-21 03:59:44.87854653 +0000 UTC m=+108.045942968" Apr 21 03:59:51.466758 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:51.466722 2569 scope.go:117] "RemoveContainer" containerID="e1bc6b5ebb4b33df44640cb933543bfbaab3796c29ed3f73eaf3d2263d2709d1" Apr 21 03:59:51.885577 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:51.885551 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log" Apr 21 03:59:51.885712 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:51.885641 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" event={"ID":"6f274aa3-e379-471c-9815-cb6212a1b1ef","Type":"ContainerStarted","Data":"bff49b9f0f08f307d8e642f15ea1c9ad1d39c49db9055d73354798cb3156afb4"} Apr 21 03:59:51.885938 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:51.885916 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 03:59:51.903230 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:51.903187 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" podStartSLOduration=21.439258234 podStartE2EDuration="22.903176475s" podCreationTimestamp="2026-04-21 03:59:29 +0000 UTC" firstStartedPulling="2026-04-21 03:59:29.759640898 +0000 UTC m=+92.927037314" lastFinishedPulling="2026-04-21 03:59:31.223559137 +0000 UTC m=+94.390955555" observedRunningTime="2026-04-21 03:59:51.902514477 +0000 UTC m=+115.069910909" watchObservedRunningTime="2026-04-21 03:59:51.903176475 +0000 UTC m=+115.070572912" Apr 21 03:59:51.950363 ip-10-0-138-120 kubenswrapper[2569]: I0421 03:59:51.950335 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-cjfz9" Apr 21 04:00:00.218008 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.217966 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7nh4z"] Apr 21 04:00:00.222755 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.222733 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.225174 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.225155 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:00:00.226138 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.226112 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:00:00.226252 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.226158 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-tgq7l\"" Apr 21 04:00:00.230699 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.230677 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7nh4z"] Apr 21 04:00:00.298530 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.298496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/12e2811d-0a97-4e4c-a872-22799a21a2c4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.298681 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.298557 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5lp9\" 
(UniqueName: \"kubernetes.io/projected/12e2811d-0a97-4e4c-a872-22799a21a2c4-kube-api-access-h5lp9\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.298681 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.298604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/12e2811d-0a97-4e4c-a872-22799a21a2c4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.298755 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.298675 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/12e2811d-0a97-4e4c-a872-22799a21a2c4-crio-socket\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.298755 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.298704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/12e2811d-0a97-4e4c-a872-22799a21a2c4-data-volume\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.323392 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.323339 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c98bb8c4b-66m2p"] Apr 21 04:00:00.326470 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.326450 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f"] Apr 21 04:00:00.326638 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.326618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.329336 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.329317 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 04:00:00.329456 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.329440 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 04:00:00.329543 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.329525 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:00.329801 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.329778 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fzhgd\"" Apr 21 04:00:00.330560 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.330543 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 04:00:00.331934 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.331918 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-wz5gc\"" Apr 21 04:00:00.333559 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.333541 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 04:00:00.334292 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.334269 2569 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 04:00:00.339830 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.339791 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f"] Apr 21 04:00:00.343663 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.343639 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c98bb8c4b-66m2p"] Apr 21 04:00:00.399909 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.399877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a908e1d-c424-42bf-9ffc-a48edafa8faa-ca-trust-extracted\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400070 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.399917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/12e2811d-0a97-4e4c-a872-22799a21a2c4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.400070 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.399942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-registry-tls\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400070 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400020 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-bound-sa-token\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400070 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt9gz\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-kube-api-access-zt9gz\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400263 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5lp9\" (UniqueName: \"kubernetes.io/projected/12e2811d-0a97-4e4c-a872-22799a21a2c4-kube-api-access-h5lp9\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.400263 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/12e2811d-0a97-4e4c-a872-22799a21a2c4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.400263 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/9a908e1d-c424-42bf-9ffc-a48edafa8faa-installation-pull-secrets\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400263 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/12e2811d-0a97-4e4c-a872-22799a21a2c4-crio-socket\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.400263 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/12e2811d-0a97-4e4c-a872-22799a21a2c4-data-volume\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.400263 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9a908e1d-c424-42bf-9ffc-a48edafa8faa-image-registry-private-configuration\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400508 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a908e1d-c424-42bf-9ffc-a48edafa8faa-trusted-ca\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " 
pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400508 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/12e2811d-0a97-4e4c-a872-22799a21a2c4-crio-socket\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.400508 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8a74424a-0a97-406e-a9ab-19b6f495be26-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6x4f\" (UID: \"8a74424a-0a97-406e-a9ab-19b6f495be26\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:00.400508 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a908e1d-c424-42bf-9ffc-a48edafa8faa-registry-certificates\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.400629 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.400543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/12e2811d-0a97-4e4c-a872-22799a21a2c4-data-volume\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.401153 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.401132 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/12e2811d-0a97-4e4c-a872-22799a21a2c4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.402552 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.402535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/12e2811d-0a97-4e4c-a872-22799a21a2c4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.407533 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.407482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5lp9\" (UniqueName: \"kubernetes.io/projected/12e2811d-0a97-4e4c-a872-22799a21a2c4-kube-api-access-h5lp9\") pod \"insights-runtime-extractor-7nh4z\" (UID: \"12e2811d-0a97-4e4c-a872-22799a21a2c4\") " pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.500998 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.500912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-registry-tls\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.500998 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.500955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-bound-sa-token\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " 
pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.500998 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.500977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt9gz\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-kube-api-access-zt9gz\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.501285 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a908e1d-c424-42bf-9ffc-a48edafa8faa-installation-pull-secrets\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.501285 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9a908e1d-c424-42bf-9ffc-a48edafa8faa-image-registry-private-configuration\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.501285 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a908e1d-c424-42bf-9ffc-a48edafa8faa-trusted-ca\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.501455 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8a74424a-0a97-406e-a9ab-19b6f495be26-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6x4f\" (UID: \"8a74424a-0a97-406e-a9ab-19b6f495be26\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:00.501455 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a908e1d-c424-42bf-9ffc-a48edafa8faa-registry-certificates\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.501455 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a908e1d-c424-42bf-9ffc-a48edafa8faa-ca-trust-extracted\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.501860 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.501778 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9a908e1d-c424-42bf-9ffc-a48edafa8faa-ca-trust-extracted\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.502294 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.502272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9a908e1d-c424-42bf-9ffc-a48edafa8faa-trusted-ca\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " 
pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.502844 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.502819 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9a908e1d-c424-42bf-9ffc-a48edafa8faa-registry-certificates\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.503983 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.503961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8a74424a-0a97-406e-a9ab-19b6f495be26-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-w6x4f\" (UID: \"8a74424a-0a97-406e-a9ab-19b6f495be26\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:00.503983 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.503973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-registry-tls\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.504104 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.503990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9a908e1d-c424-42bf-9ffc-a48edafa8faa-installation-pull-secrets\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.504305 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.504288 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9a908e1d-c424-42bf-9ffc-a48edafa8faa-image-registry-private-configuration\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.511494 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.511466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-bound-sa-token\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.511655 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.511638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt9gz\" (UniqueName: \"kubernetes.io/projected/9a908e1d-c424-42bf-9ffc-a48edafa8faa-kube-api-access-zt9gz\") pod \"image-registry-7c98bb8c4b-66m2p\" (UID: \"9a908e1d-c424-42bf-9ffc-a48edafa8faa\") " pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.532054 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.532032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7nh4z" Apr 21 04:00:00.637568 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.637538 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.644375 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.644342 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:00.647602 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.647576 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69b444ff5d-wzs8b"] Apr 21 04:00:00.652679 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.652660 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.656502 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656476 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 04:00:00.656621 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656530 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 04:00:00.656621 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656482 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 04:00:00.656736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656478 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 04:00:00.656736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656715 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 04:00:00.656835 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656765 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 04:00:00.656835 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.656816 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 04:00:00.657150 ip-10-0-138-120 
kubenswrapper[2569]: I0421 04:00:00.656971 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-g7wts\"" Apr 21 04:00:00.657862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.657838 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7nh4z"] Apr 21 04:00:00.663973 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.661720 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b444ff5d-wzs8b"] Apr 21 04:00:00.666100 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:00.666073 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e2811d_0a97_4e4c_a872_22799a21a2c4.slice/crio-3e1a1c240543f44618095739f59e19d3f0d935dafb2ca9adc9d079b55dafd513 WatchSource:0}: Error finding container 3e1a1c240543f44618095739f59e19d3f0d935dafb2ca9adc9d079b55dafd513: Status 404 returned error can't find the container with id 3e1a1c240543f44618095739f59e19d3f0d935dafb2ca9adc9d079b55dafd513 Apr 21 04:00:00.703342 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.702952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-service-ca\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.703342 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.703002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-oauth-config\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.703342 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.703030 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-console-config\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.703342 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.703123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czxrq\" (UniqueName: \"kubernetes.io/projected/f80e7119-50ad-410d-9929-88d98ee3357c-kube-api-access-czxrq\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.703342 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.703166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-serving-cert\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.703342 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.703189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-oauth-serving-cert\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.782819 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.782794 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c98bb8c4b-66m2p"] Apr 21 04:00:00.785159 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:00.785123 
2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a908e1d_c424_42bf_9ffc_a48edafa8faa.slice/crio-2bd1f5e059c2e78a3aec3431414b4684563826aae384ca3c3d2902528850112e WatchSource:0}: Error finding container 2bd1f5e059c2e78a3aec3431414b4684563826aae384ca3c3d2902528850112e: Status 404 returned error can't find the container with id 2bd1f5e059c2e78a3aec3431414b4684563826aae384ca3c3d2902528850112e Apr 21 04:00:00.798254 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.798217 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f"] Apr 21 04:00:00.800711 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:00.800687 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a74424a_0a97_406e_a9ab_19b6f495be26.slice/crio-677fcb07a09052e323011501afb7611c4296fcc6e30f0e9c5dd98f65ca9f3dec WatchSource:0}: Error finding container 677fcb07a09052e323011501afb7611c4296fcc6e30f0e9c5dd98f65ca9f3dec: Status 404 returned error can't find the container with id 677fcb07a09052e323011501afb7611c4296fcc6e30f0e9c5dd98f65ca9f3dec Apr 21 04:00:00.804171 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.804148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czxrq\" (UniqueName: \"kubernetes.io/projected/f80e7119-50ad-410d-9929-88d98ee3357c-kube-api-access-czxrq\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.804233 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.804196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-serving-cert\") pod \"console-69b444ff5d-wzs8b\" (UID: 
\"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.804328 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.804228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-oauth-serving-cert\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.804328 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.804316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-service-ca\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.804403 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.804350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-oauth-config\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.804403 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.804375 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-console-config\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.805115 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.805063 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-oauth-serving-cert\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.805115 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.805063 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-service-ca\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.805233 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.805125 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-console-config\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.806662 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.806642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-oauth-config\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.806805 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.806787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-serving-cert\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.811431 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.811411 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-czxrq\" (UniqueName: \"kubernetes.io/projected/f80e7119-50ad-410d-9929-88d98ee3357c-kube-api-access-czxrq\") pod \"console-69b444ff5d-wzs8b\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:00.909216 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.909179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" event={"ID":"9a908e1d-c424-42bf-9ffc-a48edafa8faa","Type":"ContainerStarted","Data":"fba8bc257fc8375d255acff1c19850985333f345d375b417b26232050c427baa"} Apr 21 04:00:00.909422 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.909223 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" event={"ID":"9a908e1d-c424-42bf-9ffc-a48edafa8faa","Type":"ContainerStarted","Data":"2bd1f5e059c2e78a3aec3431414b4684563826aae384ca3c3d2902528850112e"} Apr 21 04:00:00.909422 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.909285 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:00.910528 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.910500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7nh4z" event={"ID":"12e2811d-0a97-4e4c-a872-22799a21a2c4","Type":"ContainerStarted","Data":"48c6f35224ba3cdf5ef4d8aa51b10f7ee7893782ff3148a3bc29acafdb34d835"} Apr 21 04:00:00.910528 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.910529 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7nh4z" event={"ID":"12e2811d-0a97-4e4c-a872-22799a21a2c4","Type":"ContainerStarted","Data":"3e1a1c240543f44618095739f59e19d3f0d935dafb2ca9adc9d079b55dafd513"} Apr 21 04:00:00.911514 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.911497 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" event={"ID":"8a74424a-0a97-406e-a9ab-19b6f495be26","Type":"ContainerStarted","Data":"677fcb07a09052e323011501afb7611c4296fcc6e30f0e9c5dd98f65ca9f3dec"} Apr 21 04:00:00.927762 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.927724 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" podStartSLOduration=0.927710202 podStartE2EDuration="927.710202ms" podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:00:00.926769024 +0000 UTC m=+124.094165463" watchObservedRunningTime="2026-04-21 04:00:00.927710202 +0000 UTC m=+124.095106641" Apr 21 04:00:00.967097 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:00.967062 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:01.110871 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:01.110846 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69b444ff5d-wzs8b"] Apr 21 04:00:01.112963 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:01.112932 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80e7119_50ad_410d_9929_88d98ee3357c.slice/crio-3b5f40bcd9a53430f0d8dee4f5cebabeec091392b273bd5e7c947c651bc1357f WatchSource:0}: Error finding container 3b5f40bcd9a53430f0d8dee4f5cebabeec091392b273bd5e7c947c651bc1357f: Status 404 returned error can't find the container with id 3b5f40bcd9a53430f0d8dee4f5cebabeec091392b273bd5e7c947c651bc1357f Apr 21 04:00:01.915318 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:01.915283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-7nh4z" event={"ID":"12e2811d-0a97-4e4c-a872-22799a21a2c4","Type":"ContainerStarted","Data":"b8ceb795c57cd9ccb636417f48d63eb76e17fa77a3412f67125b101530b9b950"} Apr 21 04:00:01.916289 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:01.916271 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b444ff5d-wzs8b" event={"ID":"f80e7119-50ad-410d-9929-88d98ee3357c","Type":"ContainerStarted","Data":"3b5f40bcd9a53430f0d8dee4f5cebabeec091392b273bd5e7c947c651bc1357f"} Apr 21 04:00:01.917438 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:01.917410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" event={"ID":"8a74424a-0a97-406e-a9ab-19b6f495be26","Type":"ContainerStarted","Data":"d9350a9b272dd5e61532be7635a875f147c87edc36eb116b1f7937898bdc3ee6"} Apr 21 04:00:01.932756 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:01.932717 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" podStartSLOduration=0.957984433 podStartE2EDuration="1.932705542s" podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="2026-04-21 04:00:00.802569463 +0000 UTC m=+123.969965892" lastFinishedPulling="2026-04-21 04:00:01.777290585 +0000 UTC m=+124.944687001" observedRunningTime="2026-04-21 04:00:01.932126023 +0000 UTC m=+125.099522439" watchObservedRunningTime="2026-04-21 04:00:01.932705542 +0000 UTC m=+125.100101979" Apr 21 04:00:02.920571 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:02.920531 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:02.926418 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:02.926393 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-w6x4f" Apr 21 04:00:04.928167 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:04.928131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7nh4z" event={"ID":"12e2811d-0a97-4e4c-a872-22799a21a2c4","Type":"ContainerStarted","Data":"a1c8544324d5c03b131f74d3cab3184448da524c95ff7f1a5f028e385a68000a"} Apr 21 04:00:04.929538 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:04.929509 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b444ff5d-wzs8b" event={"ID":"f80e7119-50ad-410d-9929-88d98ee3357c","Type":"ContainerStarted","Data":"4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d"} Apr 21 04:00:04.944514 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:04.944472 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7nh4z" podStartSLOduration=1.3723089960000001 podStartE2EDuration="4.944458197s" podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="2026-04-21 04:00:00.739080491 +0000 UTC m=+123.906476908" lastFinishedPulling="2026-04-21 04:00:04.311229681 +0000 UTC m=+127.478626109" observedRunningTime="2026-04-21 04:00:04.942740299 +0000 UTC m=+128.110136776" watchObservedRunningTime="2026-04-21 04:00:04.944458197 +0000 UTC m=+128.111854695" Apr 21 04:00:04.961977 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:04.961934 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69b444ff5d-wzs8b" podStartSLOduration=1.7216071240000002 podStartE2EDuration="4.961922184s" podCreationTimestamp="2026-04-21 04:00:00 +0000 UTC" firstStartedPulling="2026-04-21 04:00:01.114822882 +0000 UTC m=+124.282219301" lastFinishedPulling="2026-04-21 04:00:04.355137943 +0000 UTC m=+127.522534361" observedRunningTime="2026-04-21 04:00:04.960920128 +0000 UTC m=+128.128316578" 
watchObservedRunningTime="2026-04-21 04:00:04.961922184 +0000 UTC m=+128.129318621" Apr 21 04:00:07.160032 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:07.159993 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 04:00:07.162920 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:07.162895 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3-metrics-certs\") pod \"network-metrics-daemon-cxzzc\" (UID: \"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3\") " pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 04:00:07.277212 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:07.277183 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k29s5\"" Apr 21 04:00:07.285021 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:07.285003 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxzzc" Apr 21 04:00:07.398266 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:07.397986 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cxzzc"] Apr 21 04:00:07.404396 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:07.404372 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e7fd7b_a1ea_4978_8b92_3e0b1c40ada3.slice/crio-d0013847c68d416f6b8ae7ff1e6c56cc05f034ff1a36b09ab40795e74737d8a2 WatchSource:0}: Error finding container d0013847c68d416f6b8ae7ff1e6c56cc05f034ff1a36b09ab40795e74737d8a2: Status 404 returned error can't find the container with id d0013847c68d416f6b8ae7ff1e6c56cc05f034ff1a36b09ab40795e74737d8a2 Apr 21 04:00:07.939848 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:07.939810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxzzc" event={"ID":"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3","Type":"ContainerStarted","Data":"d0013847c68d416f6b8ae7ff1e6c56cc05f034ff1a36b09ab40795e74737d8a2"} Apr 21 04:00:08.024259 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.024189 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9"] Apr 21 04:00:08.027648 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.026899 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.029678 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.029652 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 04:00:08.029886 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.029867 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 04:00:08.032002 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.031978 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-7pgp8\"" Apr 21 04:00:08.032340 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.032314 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 04:00:08.032553 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.032537 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 04:00:08.032732 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.032718 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:00:08.037632 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.037615 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9"] Apr 21 04:00:08.040886 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.040645 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4knsl"] Apr 21 04:00:08.045671 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.045380 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.046363 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.046336 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5djqb"] Apr 21 04:00:08.049038 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.048219 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 04:00:08.049038 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.048648 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 04:00:08.049038 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.048852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-4xctp\"" Apr 21 04:00:08.049260 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.049065 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 04:00:08.053207 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.053190 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.058525 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.056976 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 04:00:08.058525 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.056983 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-d559f\"" Apr 21 04:00:08.058525 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.057369 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 04:00:08.058525 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.057564 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 04:00:08.058525 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.058494 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4knsl"] Apr 21 04:00:08.169606 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169573 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-wtmp\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7f787778-1993-43c5-97a5-e8a841e298b8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169767 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169793 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: 
I0421 04:00:08.169843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5trcw\" (UniqueName: \"kubernetes.io/projected/87911e74-ac2f-440d-ae73-34b23d6d9a70-kube-api-access-5trcw\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169880 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4bc\" (UniqueName: \"kubernetes.io/projected/cae1cda9-6e23-44ac-a8f0-2e07752524d8-kube-api-access-tk4bc\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169906 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87911e74-ac2f-440d-ae73-34b23d6d9a70-metrics-client-ca\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169935 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7f787778-1993-43c5-97a5-e8a841e298b8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.169982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sqtn\" (UniqueName: \"kubernetes.io/projected/7f787778-1993-43c5-97a5-e8a841e298b8-kube-api-access-6sqtn\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-sys\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170030 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cae1cda9-6e23-44ac-a8f0-2e07752524d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.170678 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-root\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170678 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170115 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.170678 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-accelerators-collector-config\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.170678 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.170678 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.170305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-textfile\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271393 ip-10-0-138-120 kubenswrapper[2569]: 
I0421 04:00:08.271305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk4bc\" (UniqueName: \"kubernetes.io/projected/cae1cda9-6e23-44ac-a8f0-2e07752524d8-kube-api-access-tk4bc\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.271393 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87911e74-ac2f-440d-ae73-34b23d6d9a70-metrics-client-ca\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f787778-1993-43c5-97a5-e8a841e298b8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sqtn\" (UniqueName: \"kubernetes.io/projected/7f787778-1993-43c5-97a5-e8a841e298b8-kube-api-access-6sqtn\") pod 
\"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-sys\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cae1cda9-6e23-44ac-a8f0-2e07752524d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-root\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271606 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-accelerators-collector-config\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271697 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-textfile\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271718 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-sys\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-wtmp\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") 
" pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.271760 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271764 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7f787778-1993-43c5-97a5-e8a841e298b8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.272437 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.272437 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.272437 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.272437 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271907 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.272437 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.271934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5trcw\" (UniqueName: \"kubernetes.io/projected/87911e74-ac2f-440d-ae73-34b23d6d9a70-kube-api-access-5trcw\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.272437 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.272291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87911e74-ac2f-440d-ae73-34b23d6d9a70-metrics-client-ca\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.272936 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.272909 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cae1cda9-6e23-44ac-a8f0-2e07752524d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.273063 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.272981 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-root\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.274005 ip-10-0-138-120 
kubenswrapper[2569]: I0421 04:00:08.273910 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f787778-1993-43c5-97a5-e8a841e298b8-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.274005 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.273971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7f787778-1993-43c5-97a5-e8a841e298b8-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.274458 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:08.274350 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 04:00:08.274458 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.274410 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-accelerators-collector-config\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.274458 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:08.274426 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-tls podName:cae1cda9-6e23-44ac-a8f0-2e07752524d8 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:08.774405575 +0000 UTC m=+131.941801992 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-95tf9" (UID: "cae1cda9-6e23-44ac-a8f0-2e07752524d8") : secret "openshift-state-metrics-tls" not found Apr 21 04:00:08.274951 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.274898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.275317 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:08.275064 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 04:00:08.275317 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:08.275131 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls podName:87911e74-ac2f-440d-ae73-34b23d6d9a70 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:08.775113508 +0000 UTC m=+131.942509933 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls") pod "node-exporter-5djqb" (UID: "87911e74-ac2f-440d-ae73-34b23d6d9a70") : secret "node-exporter-tls" not found Apr 21 04:00:08.275317 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.275167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-textfile\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.275317 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.275267 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-wtmp\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.277546 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.277526 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.277615 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.277578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.278116 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.278035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.278581 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.278311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f787778-1993-43c5-97a5-e8a841e298b8-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.281375 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.281351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5trcw\" (UniqueName: \"kubernetes.io/projected/87911e74-ac2f-440d-ae73-34b23d6d9a70-kube-api-access-5trcw\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.281494 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.281474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk4bc\" (UniqueName: \"kubernetes.io/projected/cae1cda9-6e23-44ac-a8f0-2e07752524d8-kube-api-access-tk4bc\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.284517 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.284470 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6sqtn\" (UniqueName: \"kubernetes.io/projected/7f787778-1993-43c5-97a5-e8a841e298b8-kube-api-access-6sqtn\") pod \"kube-state-metrics-69db897b98-4knsl\" (UID: \"7f787778-1993-43c5-97a5-e8a841e298b8\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.368699 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.368661 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" Apr 21 04:00:08.542428 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.542290 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-4knsl"] Apr 21 04:00:08.545274 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:08.545214 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f787778_1993_43c5_97a5_e8a841e298b8.slice/crio-3f60fbcb17ee1b0801acd4a12c125e9ad0d8de944362da82e17b919a8f8ab3c6 WatchSource:0}: Error finding container 3f60fbcb17ee1b0801acd4a12c125e9ad0d8de944362da82e17b919a8f8ab3c6: Status 404 returned error can't find the container with id 3f60fbcb17ee1b0801acd4a12c125e9ad0d8de944362da82e17b919a8f8ab3c6 Apr 21 04:00:08.776151 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.776048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.776151 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.776102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:08.776395 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:08.776219 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 04:00:08.776395 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:08.776295 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls podName:87911e74-ac2f-440d-ae73-34b23d6d9a70 nodeName:}" failed. No retries permitted until 2026-04-21 04:00:09.776278889 +0000 UTC m=+132.943675304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls") pod "node-exporter-5djqb" (UID: "87911e74-ac2f-440d-ae73-34b23d6d9a70") : secret "node-exporter-tls" not found Apr 21 04:00:08.778466 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.778443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cae1cda9-6e23-44ac-a8f0-2e07752524d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-95tf9\" (UID: \"cae1cda9-6e23-44ac-a8f0-2e07752524d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.941012 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.940981 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" Apr 21 04:00:08.943915 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.943888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" event={"ID":"7f787778-1993-43c5-97a5-e8a841e298b8","Type":"ContainerStarted","Data":"3f60fbcb17ee1b0801acd4a12c125e9ad0d8de944362da82e17b919a8f8ab3c6"} Apr 21 04:00:08.945499 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.945477 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxzzc" event={"ID":"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3","Type":"ContainerStarted","Data":"8d0ef1c1df7f2cbd9ae59df52859cef6eb1c05c3939bd43e17c9e1905228d207"} Apr 21 04:00:08.945499 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.945506 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxzzc" event={"ID":"03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3","Type":"ContainerStarted","Data":"d369c4f5531bb2ba6fc462c42e5e430fddcece5137ba58c45ae03a315f539b8a"} Apr 21 04:00:08.964517 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:08.964471 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cxzzc" podStartSLOduration=130.94306713 podStartE2EDuration="2m11.964455721s" podCreationTimestamp="2026-04-21 03:57:57 +0000 UTC" firstStartedPulling="2026-04-21 04:00:07.406224318 +0000 UTC m=+130.573620734" lastFinishedPulling="2026-04-21 04:00:08.427612906 +0000 UTC m=+131.595009325" observedRunningTime="2026-04-21 04:00:08.963366055 +0000 UTC m=+132.130762518" watchObservedRunningTime="2026-04-21 04:00:08.964455721 +0000 UTC m=+132.131852202" Apr 21 04:00:09.086156 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.086127 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9"] Apr 21 
04:00:09.088041 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:09.088013 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcae1cda9_6e23_44ac_a8f0_2e07752524d8.slice/crio-7b108653a35c35cdefc2eab3000c27ed2940715f73ef960eb851e303c8f46010 WatchSource:0}: Error finding container 7b108653a35c35cdefc2eab3000c27ed2940715f73ef960eb851e303c8f46010: Status 404 returned error can't find the container with id 7b108653a35c35cdefc2eab3000c27ed2940715f73ef960eb851e303c8f46010 Apr 21 04:00:09.786571 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.786538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:09.789676 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.789649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/87911e74-ac2f-440d-ae73-34b23d6d9a70-node-exporter-tls\") pod \"node-exporter-5djqb\" (UID: \"87911e74-ac2f-440d-ae73-34b23d6d9a70\") " pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:09.877140 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.877064 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5djqb" Apr 21 04:00:09.889652 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:09.889623 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87911e74_ac2f_440d_ae73_34b23d6d9a70.slice/crio-f3976707efd1450f322d47b973a99783088930c1f1ee839106e3d9cef11880aa WatchSource:0}: Error finding container f3976707efd1450f322d47b973a99783088930c1f1ee839106e3d9cef11880aa: Status 404 returned error can't find the container with id f3976707efd1450f322d47b973a99783088930c1f1ee839106e3d9cef11880aa Apr 21 04:00:09.949980 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.949886 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5djqb" event={"ID":"87911e74-ac2f-440d-ae73-34b23d6d9a70","Type":"ContainerStarted","Data":"f3976707efd1450f322d47b973a99783088930c1f1ee839106e3d9cef11880aa"} Apr 21 04:00:09.951790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.951760 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" event={"ID":"cae1cda9-6e23-44ac-a8f0-2e07752524d8","Type":"ContainerStarted","Data":"d390b07409288fe28a760c358a023e7adfb72461d97c906d51c3e91ceb00f45d"} Apr 21 04:00:09.951899 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.951799 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" event={"ID":"cae1cda9-6e23-44ac-a8f0-2e07752524d8","Type":"ContainerStarted","Data":"c705d1ad507b0f42ff72c5f63375dc996b20f6b4a00eb24f8d56ac20c758f921"} Apr 21 04:00:09.951899 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.951813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" 
event={"ID":"cae1cda9-6e23-44ac-a8f0-2e07752524d8","Type":"ContainerStarted","Data":"7b108653a35c35cdefc2eab3000c27ed2940715f73ef960eb851e303c8f46010"} Apr 21 04:00:09.953909 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.953846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" event={"ID":"7f787778-1993-43c5-97a5-e8a841e298b8","Type":"ContainerStarted","Data":"c2516a5e273d55593068385884195fc105e05d0dde22ac1c40d46e4e7605eeb7"} Apr 21 04:00:09.953909 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.953883 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" event={"ID":"7f787778-1993-43c5-97a5-e8a841e298b8","Type":"ContainerStarted","Data":"7e430442c9c86ea45d71b938c758a8d517a3889ca8a4fa7820b2faa3cd2b3b80"} Apr 21 04:00:09.953909 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.953895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" event={"ID":"7f787778-1993-43c5-97a5-e8a841e298b8","Type":"ContainerStarted","Data":"5378fc8a90002fafbbbed82ad0bbb8f718efb1b5f148a9126f9cb2384bf1d41b"} Apr 21 04:00:09.972866 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:09.972813 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-4knsl" podStartSLOduration=0.819597316 podStartE2EDuration="1.972793451s" podCreationTimestamp="2026-04-21 04:00:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:08.547430047 +0000 UTC m=+131.714826472" lastFinishedPulling="2026-04-21 04:00:09.700626191 +0000 UTC m=+132.868022607" observedRunningTime="2026-04-21 04:00:09.971610194 +0000 UTC m=+133.139006648" watchObservedRunningTime="2026-04-21 04:00:09.972793451 +0000 UTC m=+133.140189890" Apr 21 04:00:10.958829 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.958753 2569 generic.go:358] "Generic (PLEG): container 
finished" podID="87911e74-ac2f-440d-ae73-34b23d6d9a70" containerID="20d8a2bffda7c07620d8217af530a254033f7ef436b3b6db761be0096eb54e05" exitCode=0 Apr 21 04:00:10.959296 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.958836 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5djqb" event={"ID":"87911e74-ac2f-440d-ae73-34b23d6d9a70","Type":"ContainerDied","Data":"20d8a2bffda7c07620d8217af530a254033f7ef436b3b6db761be0096eb54e05"} Apr 21 04:00:10.960959 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.960933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" event={"ID":"cae1cda9-6e23-44ac-a8f0-2e07752524d8","Type":"ContainerStarted","Data":"7abd724a44c283e400629ad64ed8af372478e7788154fd3689d98bcc5e93bfdb"} Apr 21 04:00:10.968005 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.967984 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:10.968113 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.968020 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:10.969292 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.969262 2569 patch_prober.go:28] interesting pod/console-69b444ff5d-wzs8b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body= Apr 21 04:00:10.969396 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.969305 2569 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-69b444ff5d-wzs8b" podUID="f80e7119-50ad-410d-9929-88d98ee3357c" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" Apr 21 04:00:10.996111 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:10.996074 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-95tf9" podStartSLOduration=1.9844746340000001 podStartE2EDuration="2.996061834s" podCreationTimestamp="2026-04-21 04:00:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:09.21981268 +0000 UTC m=+132.387209100" lastFinishedPulling="2026-04-21 04:00:10.231399878 +0000 UTC m=+133.398796300" observedRunningTime="2026-04-21 04:00:10.995116828 +0000 UTC m=+134.162513266" watchObservedRunningTime="2026-04-21 04:00:10.996061834 +0000 UTC m=+134.163458272" Apr 21 04:00:11.966074 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:11.966033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5djqb" event={"ID":"87911e74-ac2f-440d-ae73-34b23d6d9a70","Type":"ContainerStarted","Data":"a1d0d90429b10928b8c81fe51211a66f828a63cb9f6d71cf76dd8b481e9805e0"} Apr 21 04:00:11.966074 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:11.966073 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5djqb" event={"ID":"87911e74-ac2f-440d-ae73-34b23d6d9a70","Type":"ContainerStarted","Data":"1f0661d8196583294c4b3a3a7db8d5e42cdd45c0dad5b56a017bf8154eb727c2"} Apr 21 04:00:11.987779 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:11.987729 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5djqb" podStartSLOduration=3.194370561 podStartE2EDuration="3.987716201s" podCreationTimestamp="2026-04-21 04:00:08 +0000 UTC" firstStartedPulling="2026-04-21 04:00:09.892901201 +0000 UTC m=+133.060297623" lastFinishedPulling="2026-04-21 04:00:10.686246834 +0000 UTC m=+133.853643263" observedRunningTime="2026-04-21 04:00:11.986492907 +0000 UTC m=+135.153889345" watchObservedRunningTime="2026-04-21 04:00:11.987716201 +0000 UTC m=+135.155112690" Apr 21 04:00:14.235848 ip-10-0-138-120 
kubenswrapper[2569]: I0421 04:00:14.235815 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:14.238707 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.238692 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.241016 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.240994 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:00:14.241151 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241069 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 04:00:14.241151 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241088 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mwmvv\"" Apr 21 04:00:14.241151 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241106 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:00:14.241355 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241161 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:00:14.241355 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241177 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:00:14.241458 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241421 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:00:14.241747 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241719 2569 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:00:14.241747 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241727 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:00:14.241890 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.241753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:00:14.242095 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.242078 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:00:14.242179 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.242131 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:00:14.242179 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.242146 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bcp5828antn9u\"" Apr 21 04:00:14.242310 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.242279 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:00:14.244551 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.244533 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:00:14.252532 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.252511 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:14.329350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329319 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329511 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329511 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329511 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329399 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329511 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329422 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329511 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-web-config\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config-out\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329923 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329923 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329728 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srqpk\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-kube-api-access-srqpk\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329923 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329747 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329923 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.329923 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.329793 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430391 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-grpc-tls\") pod \"prometheus-k8s-0\" 
(UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430558 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430558 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430558 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-web-config\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430756 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config-out\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.430931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.430887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.431170 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.431069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.431293 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srqpk\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-kube-api-access-srqpk\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.431348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.431389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.431416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.431496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.432083 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
04:00:14.432792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.432204 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.434058 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.434025 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.434277 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.434233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.434697 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.434655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.434892 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.434874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.435271 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.435094 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.435379 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.435320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.435379 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.435320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.435857 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.435830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.435991 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.435972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.436409 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.436388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.436738 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.436711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.437019 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.437001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.437131 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.437113 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.438332 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.438314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config-out\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.440189 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.440167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srqpk\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-kube-api-access-srqpk\") pod \"prometheus-k8s-0\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.548663 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.548580 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:14.674128 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.674092 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:00:14.677699 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:14.677666 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e0506cf_fd9d_4769_a7c5_c90ce5ae71b9.slice/crio-55701a0ba1709ebb3671f379388dc6f58f2ce46dd3f8f1690b1a45142724acca WatchSource:0}: Error finding container 55701a0ba1709ebb3671f379388dc6f58f2ce46dd3f8f1690b1a45142724acca: Status 404 returned error can't find the container with id 55701a0ba1709ebb3671f379388dc6f58f2ce46dd3f8f1690b1a45142724acca Apr 21 04:00:14.975954 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:14.975924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"55701a0ba1709ebb3671f379388dc6f58f2ce46dd3f8f1690b1a45142724acca"} Apr 21 04:00:15.980979 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:15.980944 2569 generic.go:358] "Generic (PLEG): 
container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" exitCode=0 Apr 21 04:00:15.981462 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:15.980992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} Apr 21 04:00:18.243264 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:18.243204 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b444ff5d-wzs8b"] Apr 21 04:00:18.994316 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:18.994278 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} Apr 21 04:00:18.994316 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:18.994324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} Apr 21 04:00:21.002758 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:21.002723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} Apr 21 04:00:21.002758 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:21.002759 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} Apr 21 04:00:21.003167 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:21.002768 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} Apr 21 04:00:21.003167 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:21.002776 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerStarted","Data":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} Apr 21 04:00:21.029909 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:21.029856 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.6159628160000001 podStartE2EDuration="7.029839792s" podCreationTimestamp="2026-04-21 04:00:14 +0000 UTC" firstStartedPulling="2026-04-21 04:00:14.679733731 +0000 UTC m=+137.847130158" lastFinishedPulling="2026-04-21 04:00:20.093610718 +0000 UTC m=+143.261007134" observedRunningTime="2026-04-21 04:00:21.028499985 +0000 UTC m=+144.195896423" watchObservedRunningTime="2026-04-21 04:00:21.029839792 +0000 UTC m=+144.197236244" Apr 21 04:00:21.921745 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:21.921717 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c98bb8c4b-66m2p" Apr 21 04:00:24.549738 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:24.549706 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:00:33.663688 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:33.663652 2569 pod_workers.go:1301] 
"Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p2sxx" podUID="582f3951-c97b-47a0-9cb1-39d85f78b692" Apr 21 04:00:33.668756 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:33.668726 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xmfjq" podUID="4568e112-4a47-4610-a776-ed8ac69d2ca9" Apr 21 04:00:34.040093 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:34.040022 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 04:00:34.040093 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:34.040037 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p2sxx" Apr 21 04:00:38.545762 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.545726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 04:00:38.546126 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.545770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 04:00:38.548414 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.548387 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/582f3951-c97b-47a0-9cb1-39d85f78b692-metrics-tls\") pod \"dns-default-p2sxx\" (UID: \"582f3951-c97b-47a0-9cb1-39d85f78b692\") " pod="openshift-dns/dns-default-p2sxx" Apr 21 04:00:38.548519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.548439 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4568e112-4a47-4610-a776-ed8ac69d2ca9-cert\") pod \"ingress-canary-xmfjq\" (UID: \"4568e112-4a47-4610-a776-ed8ac69d2ca9\") " pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 04:00:38.843964 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.843935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7tx7c\"" Apr 21 04:00:38.844161 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.843935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7mknc\"" Apr 21 04:00:38.851355 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.851337 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmfjq" Apr 21 04:00:38.851442 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.851423 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-p2sxx" Apr 21 04:00:38.996562 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:38.996522 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmfjq"] Apr 21 04:00:38.999543 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:38.999495 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4568e112_4a47_4610_a776_ed8ac69d2ca9.slice/crio-4a5054418880fd23d55a73925bfd3bc01a123d3794d038fb2fcebaecc4c598e1 WatchSource:0}: Error finding container 4a5054418880fd23d55a73925bfd3bc01a123d3794d038fb2fcebaecc4c598e1: Status 404 returned error can't find the container with id 4a5054418880fd23d55a73925bfd3bc01a123d3794d038fb2fcebaecc4c598e1 Apr 21 04:00:39.013739 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:39.013716 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p2sxx"] Apr 21 04:00:39.016923 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:00:39.016899 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582f3951_c97b_47a0_9cb1_39d85f78b692.slice/crio-9327ed1d9136b249bd0c162187ae3f8f64d6997b8301b52e922dc795a59e1e9c WatchSource:0}: Error finding container 9327ed1d9136b249bd0c162187ae3f8f64d6997b8301b52e922dc795a59e1e9c: Status 404 returned error can't find the container with id 9327ed1d9136b249bd0c162187ae3f8f64d6997b8301b52e922dc795a59e1e9c Apr 21 04:00:39.054186 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:39.054156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p2sxx" event={"ID":"582f3951-c97b-47a0-9cb1-39d85f78b692","Type":"ContainerStarted","Data":"9327ed1d9136b249bd0c162187ae3f8f64d6997b8301b52e922dc795a59e1e9c"} Apr 21 04:00:39.055170 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:39.055147 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-canary/ingress-canary-xmfjq" event={"ID":"4568e112-4a47-4610-a776-ed8ac69d2ca9","Type":"ContainerStarted","Data":"4a5054418880fd23d55a73925bfd3bc01a123d3794d038fb2fcebaecc4c598e1"} Apr 21 04:00:41.071305 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:41.071230 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p2sxx" event={"ID":"582f3951-c97b-47a0-9cb1-39d85f78b692","Type":"ContainerStarted","Data":"23ba98cef120df49d56573a263ae4856da67fb8aca40df2fcbc6ca3397c85b92"} Apr 21 04:00:41.072852 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:41.072826 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmfjq" event={"ID":"4568e112-4a47-4610-a776-ed8ac69d2ca9","Type":"ContainerStarted","Data":"5f4150d1d6cef959ef3ec26e4b834cfdca6b44295c196dd486d5ab7abbaabcb1"} Apr 21 04:00:41.090098 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:41.090045 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xmfjq" podStartSLOduration=129.14840956 podStartE2EDuration="2m11.09001749s" podCreationTimestamp="2026-04-21 03:58:30 +0000 UTC" firstStartedPulling="2026-04-21 04:00:39.00142607 +0000 UTC m=+162.168822486" lastFinishedPulling="2026-04-21 04:00:40.943033998 +0000 UTC m=+164.110430416" observedRunningTime="2026-04-21 04:00:41.089427568 +0000 UTC m=+164.256824011" watchObservedRunningTime="2026-04-21 04:00:41.09001749 +0000 UTC m=+164.257413929" Apr 21 04:00:42.077924 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:42.077884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p2sxx" event={"ID":"582f3951-c97b-47a0-9cb1-39d85f78b692","Type":"ContainerStarted","Data":"571dd0ae0c3addb31ae00c9066b94aac27941b089b86f7eb22447cb6ec5e4948"} Apr 21 04:00:42.094684 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:42.094634 2569 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/dns-default-p2sxx" podStartSLOduration=130.17486698 podStartE2EDuration="2m12.094618875s" podCreationTimestamp="2026-04-21 03:58:30 +0000 UTC" firstStartedPulling="2026-04-21 04:00:39.018782422 +0000 UTC m=+162.186178837" lastFinishedPulling="2026-04-21 04:00:40.9385343 +0000 UTC m=+164.105930732" observedRunningTime="2026-04-21 04:00:42.092850977 +0000 UTC m=+165.260247415" watchObservedRunningTime="2026-04-21 04:00:42.094618875 +0000 UTC m=+165.262015312" Apr 21 04:00:43.081212 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.081185 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p2sxx" Apr 21 04:00:43.270496 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.270441 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69b444ff5d-wzs8b" podUID="f80e7119-50ad-410d-9929-88d98ee3357c" containerName="console" containerID="cri-o://4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d" gracePeriod=15 Apr 21 04:00:43.506565 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.506540 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b444ff5d-wzs8b_f80e7119-50ad-410d-9929-88d98ee3357c/console/0.log" Apr 21 04:00:43.506671 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.506609 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:43.590021 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.589983 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-oauth-serving-cert\") pod \"f80e7119-50ad-410d-9929-88d98ee3357c\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " Apr 21 04:00:43.590188 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590048 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-service-ca\") pod \"f80e7119-50ad-410d-9929-88d98ee3357c\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " Apr 21 04:00:43.590188 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590101 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-serving-cert\") pod \"f80e7119-50ad-410d-9929-88d98ee3357c\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " Apr 21 04:00:43.590188 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590144 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czxrq\" (UniqueName: \"kubernetes.io/projected/f80e7119-50ad-410d-9929-88d98ee3357c-kube-api-access-czxrq\") pod \"f80e7119-50ad-410d-9929-88d98ee3357c\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " Apr 21 04:00:43.590188 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590172 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-oauth-config\") pod \"f80e7119-50ad-410d-9929-88d98ee3357c\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " Apr 21 04:00:43.590427 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590203 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-console-config\") pod \"f80e7119-50ad-410d-9929-88d98ee3357c\" (UID: \"f80e7119-50ad-410d-9929-88d98ee3357c\") " Apr 21 04:00:43.590590 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590470 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-service-ca" (OuterVolumeSpecName: "service-ca") pod "f80e7119-50ad-410d-9929-88d98ee3357c" (UID: "f80e7119-50ad-410d-9929-88d98ee3357c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:43.590744 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590589 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f80e7119-50ad-410d-9929-88d98ee3357c" (UID: "f80e7119-50ad-410d-9929-88d98ee3357c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:43.590840 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.590747 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-console-config" (OuterVolumeSpecName: "console-config") pod "f80e7119-50ad-410d-9929-88d98ee3357c" (UID: "f80e7119-50ad-410d-9929-88d98ee3357c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:00:43.592646 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.592620 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f80e7119-50ad-410d-9929-88d98ee3357c" (UID: "f80e7119-50ad-410d-9929-88d98ee3357c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:43.593027 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.593005 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e7119-50ad-410d-9929-88d98ee3357c-kube-api-access-czxrq" (OuterVolumeSpecName: "kube-api-access-czxrq") pod "f80e7119-50ad-410d-9929-88d98ee3357c" (UID: "f80e7119-50ad-410d-9929-88d98ee3357c"). InnerVolumeSpecName "kube-api-access-czxrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:00:43.593081 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.593006 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f80e7119-50ad-410d-9929-88d98ee3357c" (UID: "f80e7119-50ad-410d-9929-88d98ee3357c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:00:43.691664 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.691587 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-serving-cert\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:00:43.691664 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.691612 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czxrq\" (UniqueName: \"kubernetes.io/projected/f80e7119-50ad-410d-9929-88d98ee3357c-kube-api-access-czxrq\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:00:43.691664 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.691624 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f80e7119-50ad-410d-9929-88d98ee3357c-console-oauth-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:00:43.691664 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.691632 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-console-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:00:43.691664 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.691641 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-oauth-serving-cert\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:00:43.691664 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:43.691650 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f80e7119-50ad-410d-9929-88d98ee3357c-service-ca\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:00:44.085869 ip-10-0-138-120 
kubenswrapper[2569]: I0421 04:00:44.085786 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69b444ff5d-wzs8b_f80e7119-50ad-410d-9929-88d98ee3357c/console/0.log" Apr 21 04:00:44.085869 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.085824 2569 generic.go:358] "Generic (PLEG): container finished" podID="f80e7119-50ad-410d-9929-88d98ee3357c" containerID="4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d" exitCode=2 Apr 21 04:00:44.086366 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.085854 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b444ff5d-wzs8b" event={"ID":"f80e7119-50ad-410d-9929-88d98ee3357c","Type":"ContainerDied","Data":"4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d"} Apr 21 04:00:44.086366 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.085899 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69b444ff5d-wzs8b" Apr 21 04:00:44.086366 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.085913 2569 scope.go:117] "RemoveContainer" containerID="4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d" Apr 21 04:00:44.086366 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.085901 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69b444ff5d-wzs8b" event={"ID":"f80e7119-50ad-410d-9929-88d98ee3357c","Type":"ContainerDied","Data":"3b5f40bcd9a53430f0d8dee4f5cebabeec091392b273bd5e7c947c651bc1357f"} Apr 21 04:00:44.094288 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.094263 2569 scope.go:117] "RemoveContainer" containerID="4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d" Apr 21 04:00:44.094541 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:00:44.094520 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d\": container with ID starting with 4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d not found: ID does not exist" containerID="4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d" Apr 21 04:00:44.094587 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.094551 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d"} err="failed to get container status \"4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d\": rpc error: code = NotFound desc = could not find container \"4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d\": container with ID starting with 4961ec63466bb54ed7d7b4cecc3b99419bef76072b8119c42e306b882079b70d not found: ID does not exist" Apr 21 04:00:44.108410 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.108386 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69b444ff5d-wzs8b"] Apr 21 04:00:44.113266 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:44.113230 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69b444ff5d-wzs8b"] Apr 21 04:00:45.469934 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:45.469900 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e7119-50ad-410d-9929-88d98ee3357c" path="/var/lib/kubelet/pods/f80e7119-50ad-410d-9929-88d98ee3357c/volumes" Apr 21 04:00:47.098930 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:47.098896 2569 generic.go:358] "Generic (PLEG): container finished" podID="15ac9e93-0c06-48fe-bfdb-ba6d11fedd31" containerID="e0b1b6ce29b542201494b99ad3a3593ec1397af60db39b5960340f6e0409a656" exitCode=0 Apr 21 04:00:47.099336 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:47.098974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" event={"ID":"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31","Type":"ContainerDied","Data":"e0b1b6ce29b542201494b99ad3a3593ec1397af60db39b5960340f6e0409a656"} Apr 21 04:00:47.099336 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:47.099320 2569 scope.go:117] "RemoveContainer" containerID="e0b1b6ce29b542201494b99ad3a3593ec1397af60db39b5960340f6e0409a656" Apr 21 04:00:48.103762 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:48.103727 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-h8mkr" event={"ID":"15ac9e93-0c06-48fe-bfdb-ba6d11fedd31","Type":"ContainerStarted","Data":"27f63a6f2f05413c4c05c4509e78ad2c92b2b41e7afb08e2f900ea651c844ed1"} Apr 21 04:00:52.116531 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:52.116492 2569 generic.go:358] "Generic (PLEG): container finished" podID="2fea1f5b-33a1-4b14-a6f9-dc99d97673d2" containerID="d2a12f4e1786f63e48c7ab4120267a74fba0e4903b351ab6bae40b8a2491ab6e" exitCode=0 Apr 21 04:00:52.116895 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:52.116563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-657t5" event={"ID":"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2","Type":"ContainerDied","Data":"d2a12f4e1786f63e48c7ab4120267a74fba0e4903b351ab6bae40b8a2491ab6e"} Apr 21 04:00:52.116954 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:52.116901 2569 scope.go:117] "RemoveContainer" containerID="d2a12f4e1786f63e48c7ab4120267a74fba0e4903b351ab6bae40b8a2491ab6e" Apr 21 04:00:53.088127 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:53.088101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p2sxx" Apr 21 04:00:53.121308 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:00:53.121279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-657t5" 
event={"ID":"2fea1f5b-33a1-4b14-a6f9-dc99d97673d2","Type":"ContainerStarted","Data":"ea434badef783a1a4e3f8f4ac980706f9222029eb9160b10be2ce99b0f3c21ef"} Apr 21 04:01:14.549741 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:14.549707 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:14.572297 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:14.572272 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:15.207725 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:15.207699 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:32.594595 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.594558 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:01:32.595071 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.594970 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="prometheus" containerID="cri-o://e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" gracePeriod=600 Apr 21 04:01:32.595071 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.595006 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy" containerID="cri-o://638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" gracePeriod=600 Apr 21 04:01:32.595071 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.595039 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="thanos-sidecar" 
containerID="cri-o://efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" gracePeriod=600 Apr 21 04:01:32.595267 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.595083 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="config-reloader" containerID="cri-o://9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" gracePeriod=600 Apr 21 04:01:32.595267 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.595111 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-web" containerID="cri-o://29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" gracePeriod=600 Apr 21 04:01:32.595267 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.595042 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-thanos" containerID="cri-o://81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" gracePeriod=600 Apr 21 04:01:32.839015 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.838993 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:32.893751 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893674 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-trusted-ca-bundle\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.893751 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893713 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-db\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.893751 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893739 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893760 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-metrics-client-ca\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893785 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-web-config\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893824 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-tls\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893856 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-kube-rbac-proxy\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893882 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config-out\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893910 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893944 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894000 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.893975 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-kubelet-serving-ca-bundle\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894005 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-serving-certs-ca-bundle\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894042 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-grpc-tls\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894070 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-thanos-prometheus-http-client-file\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894117 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-rulefiles-0\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 
04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894141 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894164 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-tls-assets\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894192 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-metrics-client-certs\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894434 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894224 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srqpk\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-kube-api-access-srqpk\") pod \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\" (UID: \"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9\") " Apr 21 04:01:32.894824 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894498 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 
04:01:32.894824 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894521 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:32.894824 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.894810 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:32.895228 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.895201 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:32.895337 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.895310 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:01:32.898069 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.898007 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:01:32.898567 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.898524 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.898711 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.898686 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-kube-api-access-srqpk" (OuterVolumeSpecName: "kube-api-access-srqpk") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "kube-api-access-srqpk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:01:32.899019 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.898827 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). 
InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.899019 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.898896 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config-out" (OuterVolumeSpecName: "config-out") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:01:32.899019 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.898940 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config" (OuterVolumeSpecName: "config") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.899250 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.899151 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.899595 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.899570 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.899696 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.899586 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.899862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.899844 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:01:32.900763 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.900734 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.901319 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.901296 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.909141 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.909108 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-web-config" (OuterVolumeSpecName: "web-config") pod "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" (UID: "5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:01:32.995626 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995597 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-kube-rbac-proxy\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995626 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995623 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config-out\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995635 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995645 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995654 2569 reconciler_common.go:299] "Volume detached for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995664 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995673 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-grpc-tls\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995681 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995691 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995699 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-tls-assets\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995707 2569 reconciler_common.go:299] "Volume detached for 
volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-metrics-client-certs\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995715 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srqpk\" (UniqueName: \"kubernetes.io/projected/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-kube-api-access-srqpk\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995723 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-prometheus-k8s-db\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995733 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995741 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-configmap-metrics-client-ca\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995750 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-web-config\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:32.995790 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:32.995758 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\"" Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248848 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" exitCode=0 Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248876 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" exitCode=0 Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248882 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" exitCode=0 Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248888 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" exitCode=0 Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248894 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" exitCode=0 Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248900 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" exitCode=0 Apr 21 04:01:33.248931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248956 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248971 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.249052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.249066 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.249082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} 
Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.249091 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9","Type":"ContainerDied","Data":"55701a0ba1709ebb3671f379388dc6f58f2ce46dd3f8f1690b1a45142724acca"} Apr 21 04:01:33.249350 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.248958 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.258117 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.258100 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.264842 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.264825 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.271687 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.271668 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.272171 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.272139 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:01:33.276081 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.276060 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:01:33.278698 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.278682 2569 scope.go:117] "RemoveContainer" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.285191 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.285172 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.292228 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.292210 2569 scope.go:117] 
"RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.299223 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.299200 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.299595 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.299576 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.299668 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.299601 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} err="failed to get container status \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" Apr 21 04:01:33.299668 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.299622 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.299755 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.299706 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:01:33.299908 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.299882 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.299955 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.299913 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} err="failed to get container status \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": rpc error: code = NotFound desc = could not find container \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" Apr 21 04:01:33.299955 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.299931 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.300121 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300104 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="thanos-sidecar" Apr 21 04:01:33.300179 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300125 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="thanos-sidecar" Apr 21 04:01:33.300179 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300154 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-web" Apr 21 04:01:33.300179 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300163 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-web" Apr 21 
04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300182 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-thanos" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300192 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-thanos" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300205 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="config-reloader" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300213 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="config-reloader" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300229 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="init-config-reloader" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300269 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="init-config-reloader" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300283 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300292 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300304 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f80e7119-50ad-410d-9929-88d98ee3357c" containerName="console" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300312 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e7119-50ad-410d-9929-88d98ee3357c" containerName="console" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300323 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="prometheus" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300331 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="prometheus" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.300160 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300405 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f80e7119-50ad-410d-9929-88d98ee3357c" containerName="console" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300418 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="thanos-sidecar" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300435 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="prometheus" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300446 2569 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-web" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300455 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300464 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="kube-rbac-proxy-thanos" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300474 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" containerName="config-reloader" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300414 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} err="failed to get container status \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" Apr 21 04:01:33.300519 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300503 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.300744 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" 
containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300758 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} err="failed to get container status \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" Apr 21 04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300769 2569 scope.go:117] "RemoveContainer" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.300949 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300964 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} err="failed to get container status \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" Apr 21 
04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.300978 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.301209 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.301192 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.301472 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301210 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} err="failed to get container status \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" Apr 21 04:01:33.301472 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301225 2569 scope.go:117] "RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.301472 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:01:33.301432 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.301472 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301449 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} err="failed to get container status \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" Apr 21 04:01:33.301472 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301462 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.301654 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301624 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} err="failed to get container status \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" Apr 21 04:01:33.301654 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301636 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.301888 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301852 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} err="failed to get container status \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": rpc error: code = NotFound desc = could not 
find container \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" Apr 21 04:01:33.301888 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.301882 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.302127 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302109 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} err="failed to get container status \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" Apr 21 04:01:33.302176 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302127 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.302405 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302384 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} err="failed to get container status \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" Apr 21 04:01:33.302405 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302404 2569 scope.go:117] "RemoveContainer" 
containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.302635 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302619 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} err="failed to get container status \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" Apr 21 04:01:33.302702 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302637 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.302875 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302857 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} err="failed to get container status \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" Apr 21 04:01:33.302936 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.302876 2569 scope.go:117] "RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.303091 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303071 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} err="failed to get container status 
\"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" Apr 21 04:01:33.303157 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303092 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.303329 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303310 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} err="failed to get container status \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" Apr 21 04:01:33.303402 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303330 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.303561 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303545 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} err="failed to get container status \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": rpc error: code = NotFound desc = could not find container \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" Apr 21 04:01:33.303625 ip-10-0-138-120 
kubenswrapper[2569]: I0421 04:01:33.303562 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.303751 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303735 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} err="failed to get container status \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" Apr 21 04:01:33.303817 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303753 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.303958 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303935 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} err="failed to get container status \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" Apr 21 04:01:33.304002 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.303960 2569 scope.go:117] "RemoveContainer" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.304137 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304120 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} err="failed to get container status \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" Apr 21 04:01:33.304185 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304139 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.304392 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304375 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} err="failed to get container status \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" Apr 21 04:01:33.304456 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304396 2569 scope.go:117] "RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.304626 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304609 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} err="failed to get container status \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 
6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" Apr 21 04:01:33.304684 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304627 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.304827 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304806 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} err="failed to get container status \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" Apr 21 04:01:33.304894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.304829 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.305050 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305031 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} err="failed to get container status \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": rpc error: code = NotFound desc = could not find container \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" Apr 21 04:01:33.305050 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305049 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.305198 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305099 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.305310 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305282 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} err="failed to get container status \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" Apr 21 04:01:33.305310 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305309 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.305734 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305709 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} err="failed to get container status \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" Apr 21 04:01:33.305734 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305734 2569 scope.go:117] "RemoveContainer" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.305982 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305960 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} err="failed to 
get container status \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" Apr 21 04:01:33.306053 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.305983 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.306205 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306186 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} err="failed to get container status \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" Apr 21 04:01:33.306314 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306206 2569 scope.go:117] "RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.306466 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306446 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} err="failed to get container status \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" Apr 21 04:01:33.306518 
ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306467 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.306708 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306684 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} err="failed to get container status \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" Apr 21 04:01:33.306774 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306710 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.306936 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306919 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} err="failed to get container status \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": rpc error: code = NotFound desc = could not find container \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" Apr 21 04:01:33.306995 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.306939 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.307193 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307167 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} err="failed to get container status \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" Apr 21 04:01:33.307257 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307196 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.307537 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307507 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} err="failed to get container status \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" Apr 21 04:01:33.307537 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307536 2569 scope.go:117] "RemoveContainer" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.307682 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307619 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307778 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} err="failed to get container status 
\"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307801 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307855 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307865 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307859 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307895 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307934 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bcp5828antn9u\"" Apr 21 04:01:33.307988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.307990 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:01:33.308376 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308339 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} err="failed to get container status \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" Apr 21 04:01:33.308442 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308378 2569 scope.go:117] "RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.308442 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308420 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:01:33.308667 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308618 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mwmvv\"" Apr 21 04:01:33.308667 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308637 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:01:33.308667 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308642 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:01:33.309134 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308690 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:01:33.309134 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308810 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} err="failed to get 
container status \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" Apr 21 04:01:33.309134 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.308834 2569 scope.go:117] "RemoveContainer" containerID="81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352" Apr 21 04:01:33.311994 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.309900 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:01:33.311994 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.310063 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352"} err="failed to get container status \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": rpc error: code = NotFound desc = could not find container \"81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352\": container with ID starting with 81d07f1b04ada175b07e9dcee15ab981c158b99aeda36e11e8d41ebe67b58352 not found: ID does not exist" Apr 21 04:01:33.311994 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.310086 2569 scope.go:117] "RemoveContainer" containerID="638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7" Apr 21 04:01:33.311994 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.311649 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:01:33.311994 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.311873 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7"} err="failed to get container status \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": rpc error: code = NotFound desc = could not find container \"638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7\": container with ID starting with 638680749b447ad069fa9bb85549bf87263e79f344d91ca5f8ddfe4dcfd297f7 not found: ID does not exist" Apr 21 04:01:33.311994 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.311893 2569 scope.go:117] "RemoveContainer" containerID="29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e" Apr 21 04:01:33.313749 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.313615 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e"} err="failed to get container status \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": rpc error: code = NotFound desc = could not find container \"29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e\": container with ID starting with 29e30ac8f54d730fba19cd14855bbabbc5065ab216cb1d271ae8a3ed56ab842e not found: ID does not exist" Apr 21 04:01:33.313749 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.313643 2569 scope.go:117] "RemoveContainer" containerID="efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791" Apr 21 04:01:33.313972 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.313937 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791"} err="failed to get container status \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": rpc error: code = NotFound desc = could not find container \"efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791\": container with ID starting with 
efd1e11c02572e0f8359c064e6079a7c345b2e5e16401f254b1ddc48cc505791 not found: ID does not exist" Apr 21 04:01:33.314059 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.313960 2569 scope.go:117] "RemoveContainer" containerID="9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652" Apr 21 04:01:33.314322 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.314288 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652"} err="failed to get container status \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": rpc error: code = NotFound desc = could not find container \"9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652\": container with ID starting with 9297c7f9423fc144ddc2647bf8df895e7ac9103006f347fde799189771119652 not found: ID does not exist" Apr 21 04:01:33.314419 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.314326 2569 scope.go:117] "RemoveContainer" containerID="e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3" Apr 21 04:01:33.315102 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.314566 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3"} err="failed to get container status \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": rpc error: code = NotFound desc = could not find container \"e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3\": container with ID starting with e6f43d14b0c7e7c2694ec19ba6b50171fe5089f2183e26f6fa2d61f85895ffc3 not found: ID does not exist" Apr 21 04:01:33.315102 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.314590 2569 scope.go:117] "RemoveContainer" containerID="6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d" Apr 21 04:01:33.315102 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.314813 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d"} err="failed to get container status \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": rpc error: code = NotFound desc = could not find container \"6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d\": container with ID starting with 6a76bf744bd1f888f73f29448af365895f41d49032b409f3bc6815310f56452d not found: ID does not exist" Apr 21 04:01:33.315102 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.314978 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:01:33.316262 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.316219 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:01:33.399465 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399431 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399465 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-web-config\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-config\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e66aa4b6-8f6e-4c13-b765-490a783017ca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e66aa4b6-8f6e-4c13-b765-490a783017ca-config-out\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399670 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399646 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wwhx\" (UniqueName: \"kubernetes.io/projected/e66aa4b6-8f6e-4c13-b765-490a783017ca-kube-api-access-7wwhx\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399682 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399704 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399746 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399770 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.399862 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399810 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.400038 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:01:33.400038 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399904 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.400038 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.399943 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.469556 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.469525 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9" path="/var/lib/kubelet/pods/5e0506cf-fd9d-4769-a7c5-c90ce5ae71b9/volumes"
Apr 21 04:01:33.500819 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500819 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500952 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500824 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500952 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500952 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-web-config\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500952 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500952 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-config\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.500952 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e66aa4b6-8f6e-4c13-b765-490a783017ca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.500997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e66aa4b6-8f6e-4c13-b765-490a783017ca-config-out\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wwhx\" (UniqueName: \"kubernetes.io/projected/e66aa4b6-8f6e-4c13-b765-490a783017ca-kube-api-access-7wwhx\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501173 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.501619 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.501200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.502453 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.502426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.503975 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.503951 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504642 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504642 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504642 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504316 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e66aa4b6-8f6e-4c13-b765-490a783017ca-config-out\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504642 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504642 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504589 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504935 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504734 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504935 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.504935 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.504835 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-config\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.505677 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.505647 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.506082 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.506038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.506460 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.506432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.506460 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.506448 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e66aa4b6-8f6e-4c13-b765-490a783017ca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.506737 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.506719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-web-config\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.507052 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.507033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e66aa4b6-8f6e-4c13-b765-490a783017ca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.507282 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.507266 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e66aa4b6-8f6e-4c13-b765-490a783017ca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.512695 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.512679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wwhx\" (UniqueName: \"kubernetes.io/projected/e66aa4b6-8f6e-4c13-b765-490a783017ca-kube-api-access-7wwhx\") pod \"prometheus-k8s-0\" (UID: \"e66aa4b6-8f6e-4c13-b765-490a783017ca\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.618556 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.618533 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:01:33.744933 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:33.744910 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 04:01:33.746976 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:01:33.746944 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66aa4b6_8f6e_4c13_b765_490a783017ca.slice/crio-e5061ff8a5b1537af31e4bbf17dcce8117696e24d037fc129eef037f629b202f WatchSource:0}: Error finding container e5061ff8a5b1537af31e4bbf17dcce8117696e24d037fc129eef037f629b202f: Status 404 returned error can't find the container with id e5061ff8a5b1537af31e4bbf17dcce8117696e24d037fc129eef037f629b202f
Apr 21 04:01:34.253827 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:34.253795 2569 generic.go:358] "Generic (PLEG): container finished" podID="e66aa4b6-8f6e-4c13-b765-490a783017ca" containerID="301ff4857669fdce600e5877e65705503868bcc263757083c790dfb7d22f0820" exitCode=0
Apr 21 04:01:34.253958 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:34.253882 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerDied","Data":"301ff4857669fdce600e5877e65705503868bcc263757083c790dfb7d22f0820"}
Apr 21 04:01:34.253958 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:34.253911 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"e5061ff8a5b1537af31e4bbf17dcce8117696e24d037fc129eef037f629b202f"}
Apr 21 04:01:35.259747 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.259716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"d9ee7a8ccee883593a05d4599f3c41d363a5396c15019dd15480bf0acdbb6a53"}
Apr 21 04:01:35.259747 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.259748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"8009b5e6dfa2d4860696a8fa4a4c8aec926136e300aa99fcb4d2cb4b5f669103"}
Apr 21 04:01:35.260279 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.259757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"c6520f738007dabcc90abf838eff3c5bc5b07011b1cba00ad6e6b5fe8cfaf993"}
Apr 21 04:01:35.260279 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.259766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"f05d3fd3d1befc7b733171763bd266714c4c2bdfe25481783f43967a7b0b899e"}
Apr 21 04:01:35.260279 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.259774 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"f61071fc55db8fa9510b668888e6f13766ab0668dbba645d95fb8a339a644801"}
Apr 21 04:01:35.260279 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.259781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e66aa4b6-8f6e-4c13-b765-490a783017ca","Type":"ContainerStarted","Data":"29516e020f8e4ea7f1d0886ee6dadad24a6502b8c80661e18c5500c8cb95ad61"}
Apr 21 04:01:35.288800 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:35.288748 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.28873488 podStartE2EDuration="2.28873488s" podCreationTimestamp="2026-04-21 04:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:01:35.286338454 +0000 UTC m=+218.453734891" watchObservedRunningTime="2026-04-21 04:01:35.28873488 +0000 UTC m=+218.456131388"
Apr 21 04:01:38.618845 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:01:38.618792 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:02:33.619294 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:33.619177 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:02:33.635616 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:33.635588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:02:34.440871 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:34.440844 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 04:02:57.342875 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:57.342841 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:02:57.343545 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:57.342988 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:02:57.350071 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:57.350050 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:02:57.350451 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:57.350434 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:02:57.352946 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:02:57.352928 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 04:03:30.548228 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.548187 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-w476t"]
Apr 21 04:03:30.550596 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.550577 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.552823 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.552803 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 04:03:30.557440 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.557416 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-w476t"]
Apr 21 04:03:30.638382 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.638351 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65bde943-7b77-46db-b7e7-878233e5c51f-dbus\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.638560 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.638423 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65bde943-7b77-46db-b7e7-878233e5c51f-original-pull-secret\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.638560 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.638520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65bde943-7b77-46db-b7e7-878233e5c51f-kubelet-config\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.739554 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.739521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65bde943-7b77-46db-b7e7-878233e5c51f-kubelet-config\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.739713 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.739568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65bde943-7b77-46db-b7e7-878233e5c51f-dbus\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.739713 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.739604 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65bde943-7b77-46db-b7e7-878233e5c51f-original-pull-secret\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.739713 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.739648 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65bde943-7b77-46db-b7e7-878233e5c51f-kubelet-config\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.739821 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.739763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65bde943-7b77-46db-b7e7-878233e5c51f-dbus\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.742007 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.741978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65bde943-7b77-46db-b7e7-878233e5c51f-original-pull-secret\") pod \"global-pull-secret-syncer-w476t\" (UID: \"65bde943-7b77-46db-b7e7-878233e5c51f\") " pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.861556 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.861533 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-w476t"
Apr 21 04:03:30.979511 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.979394 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-w476t"]
Apr 21 04:03:30.981886 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:03:30.981860 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bde943_7b77_46db_b7e7_878233e5c51f.slice/crio-fa314678e37db98da7919c714170fe792cb6810db662e97053fcb45429c8e362 WatchSource:0}: Error finding container fa314678e37db98da7919c714170fe792cb6810db662e97053fcb45429c8e362: Status 404 returned error can't find the container with id fa314678e37db98da7919c714170fe792cb6810db662e97053fcb45429c8e362
Apr 21 04:03:30.983770 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:30.983749 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:03:31.587792 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:31.587754 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-w476t" event={"ID":"65bde943-7b77-46db-b7e7-878233e5c51f","Type":"ContainerStarted","Data":"fa314678e37db98da7919c714170fe792cb6810db662e97053fcb45429c8e362"}
Apr 21 04:03:35.601251 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:35.601213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-w476t" event={"ID":"65bde943-7b77-46db-b7e7-878233e5c51f","Type":"ContainerStarted","Data":"f10476f7368c4ad8b5df9ef05afacf988dab088853dd65913e02eb6dd0710fd6"}
Apr 21 04:03:35.615449 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:03:35.615400 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-w476t" podStartSLOduration=2.070207386 podStartE2EDuration="5.615386837s" podCreationTimestamp="2026-04-21 04:03:30 +0000 UTC" firstStartedPulling="2026-04-21 04:03:30.983891223 +0000 UTC m=+334.151287639" lastFinishedPulling="2026-04-21 04:03:34.529070659 +0000 UTC m=+337.696467090" observedRunningTime="2026-04-21 04:03:35.614436278 +0000 UTC m=+338.781832717" watchObservedRunningTime="2026-04-21 04:03:35.615386837 +0000 UTC m=+338.782783278"
Apr 21 04:05:33.927827 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:33.927741 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-klfjd"]
Apr 21 04:05:33.930010 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:33.929994 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:33.932280 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:33.932252 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 04:05:33.933045 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:33.933023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-v6p78\""
Apr 21 04:05:33.933045 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:33.933039 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 04:05:33.937890 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:33.937869 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-klfjd"]
Apr 21 04:05:34.041058 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.041027 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jw8\" (UniqueName: \"kubernetes.io/projected/8301f913-dc81-47ac-852a-0eff81513439-kube-api-access-j7jw8\") pod \"cert-manager-cainjector-68b757865b-klfjd\" (UID: \"8301f913-dc81-47ac-852a-0eff81513439\") " pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.041196 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.041085 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8301f913-dc81-47ac-852a-0eff81513439-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-klfjd\" (UID: \"8301f913-dc81-47ac-852a-0eff81513439\") " pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.141895 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.141851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jw8\" (UniqueName: \"kubernetes.io/projected/8301f913-dc81-47ac-852a-0eff81513439-kube-api-access-j7jw8\") pod \"cert-manager-cainjector-68b757865b-klfjd\" (UID: \"8301f913-dc81-47ac-852a-0eff81513439\") " pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.142014 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.141933 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8301f913-dc81-47ac-852a-0eff81513439-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-klfjd\" (UID: \"8301f913-dc81-47ac-852a-0eff81513439\") " pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.149720 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.149697 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8301f913-dc81-47ac-852a-0eff81513439-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-klfjd\" (UID: \"8301f913-dc81-47ac-852a-0eff81513439\") " pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.149826 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.149740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jw8\" (UniqueName: \"kubernetes.io/projected/8301f913-dc81-47ac-852a-0eff81513439-kube-api-access-j7jw8\") pod \"cert-manager-cainjector-68b757865b-klfjd\" (UID: \"8301f913-dc81-47ac-852a-0eff81513439\") " pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.246544 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.246462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd"
Apr 21 04:05:34.365842 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.365823 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-klfjd"]
Apr 21 04:05:34.368066 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:05:34.368038 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8301f913_dc81_47ac_852a_0eff81513439.slice/crio-9e435e9db7ac0fec40652f4671b501314db716f706dcf7e1fe804d7193b2d4b8 WatchSource:0}: Error finding container 9e435e9db7ac0fec40652f4671b501314db716f706dcf7e1fe804d7193b2d4b8: Status 404 returned error can't find the container with id 9e435e9db7ac0fec40652f4671b501314db716f706dcf7e1fe804d7193b2d4b8
Apr 21 04:05:34.938784 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:34.938750 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd" event={"ID":"8301f913-dc81-47ac-852a-0eff81513439","Type":"ContainerStarted","Data":"9e435e9db7ac0fec40652f4671b501314db716f706dcf7e1fe804d7193b2d4b8"}
Apr 21 04:05:37.949609 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:37.949572 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd" event={"ID":"8301f913-dc81-47ac-852a-0eff81513439","Type":"ContainerStarted","Data":"10badc4b9d06e86440b04825a0d9f74823a7731c83e72ea4152a1c57e77013f0"}
Apr 21 04:05:37.965370 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:37.965319 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-klfjd" podStartSLOduration=1.81973135 podStartE2EDuration="4.965306447s" podCreationTimestamp="2026-04-21 04:05:33 +0000 UTC" firstStartedPulling="2026-04-21 04:05:34.369819282 +0000 UTC m=+457.537215698" lastFinishedPulling="2026-04-21 04:05:37.515394357 +0000 UTC m=+460.682790795" observedRunningTime="2026-04-21 04:05:37.963702055 +0000 UTC m=+461.131098493" watchObservedRunningTime="2026-04-21 04:05:37.965306447 +0000 UTC m=+461.132702884"
Apr 21 04:05:48.520092 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.520055 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-sds5c"]
Apr 21 04:05:48.523260 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.523229 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21 04:05:48.525534 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.525515 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-2m6b6\""
Apr 21 04:05:48.530346 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.530321 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-sds5c"]
Apr 21 04:05:48.574956 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.574921 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxfzz\" (UniqueName: \"kubernetes.io/projected/5bd1f864-eb96-435e-9fca-fa2d28a72d34-kube-api-access-hxfzz\") pod \"cert-manager-79c8d999ff-sds5c\" (UID: \"5bd1f864-eb96-435e-9fca-fa2d28a72d34\") " pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21 04:05:48.575109 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.574980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bd1f864-eb96-435e-9fca-fa2d28a72d34-bound-sa-token\") pod \"cert-manager-79c8d999ff-sds5c\" (UID: \"5bd1f864-eb96-435e-9fca-fa2d28a72d34\") " pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21 04:05:48.675725 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.675691 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bd1f864-eb96-435e-9fca-fa2d28a72d34-bound-sa-token\") pod \"cert-manager-79c8d999ff-sds5c\" (UID: \"5bd1f864-eb96-435e-9fca-fa2d28a72d34\") " pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21 04:05:48.675898 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.675739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxfzz\" (UniqueName: \"kubernetes.io/projected/5bd1f864-eb96-435e-9fca-fa2d28a72d34-kube-api-access-hxfzz\") pod \"cert-manager-79c8d999ff-sds5c\" (UID: \"5bd1f864-eb96-435e-9fca-fa2d28a72d34\") " pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21 04:05:48.683056 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.683026 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5bd1f864-eb96-435e-9fca-fa2d28a72d34-bound-sa-token\") pod \"cert-manager-79c8d999ff-sds5c\" (UID: \"5bd1f864-eb96-435e-9fca-fa2d28a72d34\") " pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21 04:05:48.683166 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.683037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxfzz\" (UniqueName: \"kubernetes.io/projected/5bd1f864-eb96-435e-9fca-fa2d28a72d34-kube-api-access-hxfzz\") pod \"cert-manager-79c8d999ff-sds5c\" (UID: \"5bd1f864-eb96-435e-9fca-fa2d28a72d34\") " pod="cert-manager/cert-manager-79c8d999ff-sds5c"
Apr 21
04:05:48.833471 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.833388 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-sds5c" Apr 21 04:05:48.948753 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.948725 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-sds5c"] Apr 21 04:05:48.952138 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:05:48.952114 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bd1f864_eb96_435e_9fca_fa2d28a72d34.slice/crio-9cc490a192765711c1d2ab1ec559cca32bf634c2d80fff5c914d3534f51c4499 WatchSource:0}: Error finding container 9cc490a192765711c1d2ab1ec559cca32bf634c2d80fff5c914d3534f51c4499: Status 404 returned error can't find the container with id 9cc490a192765711c1d2ab1ec559cca32bf634c2d80fff5c914d3534f51c4499 Apr 21 04:05:48.982926 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:48.982898 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-sds5c" event={"ID":"5bd1f864-eb96-435e-9fca-fa2d28a72d34","Type":"ContainerStarted","Data":"9cc490a192765711c1d2ab1ec559cca32bf634c2d80fff5c914d3534f51c4499"} Apr 21 04:05:49.987383 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:49.987346 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-sds5c" event={"ID":"5bd1f864-eb96-435e-9fca-fa2d28a72d34","Type":"ContainerStarted","Data":"843e2a85f1d95cc1250e8253aa82d119b6a704f7a06ac9cfdf6adb21edc343e1"} Apr 21 04:05:50.008161 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:05:50.008118 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-sds5c" podStartSLOduration=2.008104707 podStartE2EDuration="2.008104707s" podCreationTimestamp="2026-04-21 04:05:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:05:50.005882161 +0000 UTC m=+473.173278613" watchObservedRunningTime="2026-04-21 04:05:50.008104707 +0000 UTC m=+473.175501146" Apr 21 04:06:20.000969 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.000937 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6"] Apr 21 04:06:20.007918 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.007898 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.015803 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.012677 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 04:06:20.015803 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.012928 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-xl2bs\"" Apr 21 04:06:20.015803 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.013096 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:06:20.015803 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.013357 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 04:06:20.015803 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.013527 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 04:06:20.015803 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.013672 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 04:06:20.016581 ip-10-0-138-120 
kubenswrapper[2569]: I0421 04:06:20.016554 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6"] Apr 21 04:06:20.137900 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.137864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/06e5a302-848b-4777-8019-47965aa5da8e-metrics-cert\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.138059 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.137915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06e5a302-848b-4777-8019-47965aa5da8e-cert\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.138059 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.137995 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhb6\" (UniqueName: \"kubernetes.io/projected/06e5a302-848b-4777-8019-47965aa5da8e-kube-api-access-6mhb6\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.138059 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.138052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/06e5a302-848b-4777-8019-47965aa5da8e-manager-config\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" 
Apr 21 04:06:20.239219 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.239182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06e5a302-848b-4777-8019-47965aa5da8e-cert\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.239427 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.239230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhb6\" (UniqueName: \"kubernetes.io/projected/06e5a302-848b-4777-8019-47965aa5da8e-kube-api-access-6mhb6\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.239427 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.239285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/06e5a302-848b-4777-8019-47965aa5da8e-manager-config\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.239427 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.239322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/06e5a302-848b-4777-8019-47965aa5da8e-metrics-cert\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.240042 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.240016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/06e5a302-848b-4777-8019-47965aa5da8e-manager-config\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.241863 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.241841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06e5a302-848b-4777-8019-47965aa5da8e-cert\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.241947 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.241896 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/06e5a302-848b-4777-8019-47965aa5da8e-metrics-cert\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.248463 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.248438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhb6\" (UniqueName: \"kubernetes.io/projected/06e5a302-848b-4777-8019-47965aa5da8e-kube-api-access-6mhb6\") pod \"lws-controller-manager-57f75ff788-sk2b6\" (UID: \"06e5a302-848b-4777-8019-47965aa5da8e\") " pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.322631 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.322549 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:20.443487 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:20.443462 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6"] Apr 21 04:06:20.446290 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:06:20.446263 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e5a302_848b_4777_8019_47965aa5da8e.slice/crio-858dc6d35c1c816b556d3760301d2afbb2f0eaaefb2d314639da8fe57321eec3 WatchSource:0}: Error finding container 858dc6d35c1c816b556d3760301d2afbb2f0eaaefb2d314639da8fe57321eec3: Status 404 returned error can't find the container with id 858dc6d35c1c816b556d3760301d2afbb2f0eaaefb2d314639da8fe57321eec3 Apr 21 04:06:21.087364 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:21.087328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" event={"ID":"06e5a302-848b-4777-8019-47965aa5da8e","Type":"ContainerStarted","Data":"858dc6d35c1c816b556d3760301d2afbb2f0eaaefb2d314639da8fe57321eec3"} Apr 21 04:06:23.096322 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:23.096288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" event={"ID":"06e5a302-848b-4777-8019-47965aa5da8e","Type":"ContainerStarted","Data":"ce43a1ec0f83750db1a29746410ecc03076e19c49d4fa4abbd791699357d5ea7"} Apr 21 04:06:23.096656 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:23.096484 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:06:23.113486 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:23.113416 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" podStartSLOduration=1.540960146 podStartE2EDuration="4.113402984s" podCreationTimestamp="2026-04-21 04:06:19 +0000 UTC" firstStartedPulling="2026-04-21 04:06:20.448300106 +0000 UTC m=+503.615696523" lastFinishedPulling="2026-04-21 04:06:23.020742933 +0000 UTC m=+506.188139361" observedRunningTime="2026-04-21 04:06:23.111847673 +0000 UTC m=+506.279244113" watchObservedRunningTime="2026-04-21 04:06:23.113402984 +0000 UTC m=+506.280799423" Apr 21 04:06:34.102197 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:06:34.102171 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-57f75ff788-sk2b6" Apr 21 04:07:03.063724 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.063687 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj"] Apr 21 04:07:03.068416 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.068398 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" Apr 21 04:07:03.071049 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.071030 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:07:03.071166 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.071066 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 04:07:03.071539 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.071521 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:07:03.071872 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.071859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-td6rv\"" Apr 21 04:07:03.077867 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.077832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8lxd\" (UniqueName: \"kubernetes.io/projected/7471ff18-2c0c-46ed-8329-84f532a99914-kube-api-access-s8lxd\") pod \"dns-operator-controller-manager-844548ff4c-znjzj\" (UID: \"7471ff18-2c0c-46ed-8329-84f532a99914\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" Apr 21 04:07:03.077990 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.077974 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj"] Apr 21 04:07:03.178766 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.178726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8lxd\" (UniqueName: \"kubernetes.io/projected/7471ff18-2c0c-46ed-8329-84f532a99914-kube-api-access-s8lxd\") pod 
\"dns-operator-controller-manager-844548ff4c-znjzj\" (UID: \"7471ff18-2c0c-46ed-8329-84f532a99914\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" Apr 21 04:07:03.195429 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.195396 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8lxd\" (UniqueName: \"kubernetes.io/projected/7471ff18-2c0c-46ed-8329-84f532a99914-kube-api-access-s8lxd\") pod \"dns-operator-controller-manager-844548ff4c-znjzj\" (UID: \"7471ff18-2c0c-46ed-8329-84f532a99914\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" Apr 21 04:07:03.385514 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.385483 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" Apr 21 04:07:03.512635 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.512608 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj"] Apr 21 04:07:03.515402 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:07:03.515376 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7471ff18_2c0c_46ed_8329_84f532a99914.slice/crio-c902dbb3f5d7689bde6c889aba0cae215710b43125830303b8705203fa58cd33 WatchSource:0}: Error finding container c902dbb3f5d7689bde6c889aba0cae215710b43125830303b8705203fa58cd33: Status 404 returned error can't find the container with id c902dbb3f5d7689bde6c889aba0cae215710b43125830303b8705203fa58cd33 Apr 21 04:07:03.707959 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.707881 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8"] Apr 21 04:07:03.711597 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.711568 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.714026 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.714005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9vhlf\"" Apr 21 04:07:03.714747 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.714728 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 04:07:03.716776 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.716628 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 04:07:03.719538 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.719512 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8"] Apr 21 04:07:03.784014 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.783970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/493f846f-1fb5-4f44-b4d6-983bc06410cb-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.784014 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.784024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/493f846f-1fb5-4f44-b4d6-983bc06410cb-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.784314 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.784125 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8577g\" (UniqueName: \"kubernetes.io/projected/493f846f-1fb5-4f44-b4d6-983bc06410cb-kube-api-access-8577g\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.885309 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.885275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8577g\" (UniqueName: \"kubernetes.io/projected/493f846f-1fb5-4f44-b4d6-983bc06410cb-kube-api-access-8577g\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.885544 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.885350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/493f846f-1fb5-4f44-b4d6-983bc06410cb-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.885544 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.885381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/493f846f-1fb5-4f44-b4d6-983bc06410cb-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.885917 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.885900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/493f846f-1fb5-4f44-b4d6-983bc06410cb-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.887931 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.887909 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/493f846f-1fb5-4f44-b4d6-983bc06410cb-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:03.896218 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:03.896198 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8577g\" (UniqueName: \"kubernetes.io/projected/493f846f-1fb5-4f44-b4d6-983bc06410cb-kube-api-access-8577g\") pod \"kuadrant-console-plugin-6c886788f8-wcxv8\" (UID: \"493f846f-1fb5-4f44-b4d6-983bc06410cb\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:04.023631 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:04.023560 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" Apr 21 04:07:04.140858 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:04.140834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8"] Apr 21 04:07:04.143200 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:07:04.143172 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493f846f_1fb5_4f44_b4d6_983bc06410cb.slice/crio-948f676118adad0501f7368363a795a5807e9a2f4c4db0361ce5d00d8e6ea199 WatchSource:0}: Error finding container 948f676118adad0501f7368363a795a5807e9a2f4c4db0361ce5d00d8e6ea199: Status 404 returned error can't find the container with id 948f676118adad0501f7368363a795a5807e9a2f4c4db0361ce5d00d8e6ea199 Apr 21 04:07:04.224403 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:04.224371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" event={"ID":"7471ff18-2c0c-46ed-8329-84f532a99914","Type":"ContainerStarted","Data":"c902dbb3f5d7689bde6c889aba0cae215710b43125830303b8705203fa58cd33"} Apr 21 04:07:04.225609 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:04.225577 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" event={"ID":"493f846f-1fb5-4f44-b4d6-983bc06410cb","Type":"ContainerStarted","Data":"948f676118adad0501f7368363a795a5807e9a2f4c4db0361ce5d00d8e6ea199"} Apr 21 04:07:07.239817 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:07.239779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" event={"ID":"7471ff18-2c0c-46ed-8329-84f532a99914","Type":"ContainerStarted","Data":"6564e2c60e76ff0919210ef78897df58bb6532f9895bff3d2ff9c2caccf49dd8"} Apr 21 04:07:07.240297 ip-10-0-138-120 kubenswrapper[2569]: I0421 
04:07:07.239912 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" Apr 21 04:07:07.260980 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:07.260930 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj" podStartSLOduration=1.288493533 podStartE2EDuration="4.260913166s" podCreationTimestamp="2026-04-21 04:07:03 +0000 UTC" firstStartedPulling="2026-04-21 04:07:03.517452182 +0000 UTC m=+546.684848597" lastFinishedPulling="2026-04-21 04:07:06.489871811 +0000 UTC m=+549.657268230" observedRunningTime="2026-04-21 04:07:07.259657631 +0000 UTC m=+550.427054069" watchObservedRunningTime="2026-04-21 04:07:07.260913166 +0000 UTC m=+550.428309616" Apr 21 04:07:09.249440 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:09.249406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" event={"ID":"493f846f-1fb5-4f44-b4d6-983bc06410cb","Type":"ContainerStarted","Data":"b6ee3a8591b5e21039186df17dc82489aa646db876a8817345284d26c4c16449"} Apr 21 04:07:09.264689 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:09.264641 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-wcxv8" podStartSLOduration=1.37331545 podStartE2EDuration="6.264627681s" podCreationTimestamp="2026-04-21 04:07:03 +0000 UTC" firstStartedPulling="2026-04-21 04:07:04.144492713 +0000 UTC m=+547.311889128" lastFinishedPulling="2026-04-21 04:07:09.03580494 +0000 UTC m=+552.203201359" observedRunningTime="2026-04-21 04:07:09.263036833 +0000 UTC m=+552.430433290" watchObservedRunningTime="2026-04-21 04:07:09.264627681 +0000 UTC m=+552.432024119" Apr 21 04:07:18.247199 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:18.247164 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-znjzj"
Apr 21 04:07:45.282591 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.282551 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:07:45.308774 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.308743 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:07:45.308930 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.308866 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.311147 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.311124 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 21 04:07:45.349499 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.349466 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szcxt\" (UniqueName: \"kubernetes.io/projected/4fab00ae-2865-42dc-8622-ca24543bddec-kube-api-access-szcxt\") pod \"limitador-limitador-64c8f475fb-wk8zt\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.349640 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.349508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4fab00ae-2865-42dc-8622-ca24543bddec-config-file\") pod \"limitador-limitador-64c8f475fb-wk8zt\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.387019 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.386991 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:07:45.450184 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.450153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szcxt\" (UniqueName: \"kubernetes.io/projected/4fab00ae-2865-42dc-8622-ca24543bddec-kube-api-access-szcxt\") pod \"limitador-limitador-64c8f475fb-wk8zt\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.450319 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.450195 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4fab00ae-2865-42dc-8622-ca24543bddec-config-file\") pod \"limitador-limitador-64c8f475fb-wk8zt\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.450769 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.450753 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4fab00ae-2865-42dc-8622-ca24543bddec-config-file\") pod \"limitador-limitador-64c8f475fb-wk8zt\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.457969 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.457943 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szcxt\" (UniqueName: \"kubernetes.io/projected/4fab00ae-2865-42dc-8622-ca24543bddec-kube-api-access-szcxt\") pod \"limitador-limitador-64c8f475fb-wk8zt\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.618962 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.618933 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:45.739417 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:45.739393 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:07:45.742039 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:07:45.742013 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fab00ae_2865_42dc_8622_ca24543bddec.slice/crio-923530f7046c70fedd095abd6cf93abab11e9eb48802422baf49c36250720acd WatchSource:0}: Error finding container 923530f7046c70fedd095abd6cf93abab11e9eb48802422baf49c36250720acd: Status 404 returned error can't find the container with id 923530f7046c70fedd095abd6cf93abab11e9eb48802422baf49c36250720acd
Apr 21 04:07:46.369019 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:46.368984 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt" event={"ID":"4fab00ae-2865-42dc-8622-ca24543bddec","Type":"ContainerStarted","Data":"923530f7046c70fedd095abd6cf93abab11e9eb48802422baf49c36250720acd"}
Apr 21 04:07:47.376342 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:47.376306 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt" event={"ID":"4fab00ae-2865-42dc-8622-ca24543bddec","Type":"ContainerStarted","Data":"759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564"}
Apr 21 04:07:47.376740 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:47.376430 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:07:47.394885 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:47.394809 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt" podStartSLOduration=0.885634309 podStartE2EDuration="2.394792565s" podCreationTimestamp="2026-04-21 04:07:45 +0000 UTC" firstStartedPulling="2026-04-21 04:07:45.743974757 +0000 UTC m=+588.911371173" lastFinishedPulling="2026-04-21 04:07:47.253133007 +0000 UTC m=+590.420529429" observedRunningTime="2026-04-21 04:07:47.393427508 +0000 UTC m=+590.560823945" watchObservedRunningTime="2026-04-21 04:07:47.394792565 +0000 UTC m=+590.562189021"
Apr 21 04:07:57.367667 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:57.367639 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:07:57.368054 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:57.367648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:07:57.373989 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:57.373969 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:07:57.374170 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:57.374153 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:07:58.386149 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:07:58.386114 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:12:45.382977 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:45.382942 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:12:45.383538 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:45.383163 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt" podUID="4fab00ae-2865-42dc-8622-ca24543bddec" containerName="limitador" containerID="cri-o://759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564" gracePeriod=30
Apr 21 04:12:46.320329 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.320299 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:12:46.353989 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.353966 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4fab00ae-2865-42dc-8622-ca24543bddec-config-file\") pod \"4fab00ae-2865-42dc-8622-ca24543bddec\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") "
Apr 21 04:12:46.354111 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.354060 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szcxt\" (UniqueName: \"kubernetes.io/projected/4fab00ae-2865-42dc-8622-ca24543bddec-kube-api-access-szcxt\") pod \"4fab00ae-2865-42dc-8622-ca24543bddec\" (UID: \"4fab00ae-2865-42dc-8622-ca24543bddec\") "
Apr 21 04:12:46.354345 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.354321 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fab00ae-2865-42dc-8622-ca24543bddec-config-file" (OuterVolumeSpecName: "config-file") pod "4fab00ae-2865-42dc-8622-ca24543bddec" (UID: "4fab00ae-2865-42dc-8622-ca24543bddec"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 04:12:46.356356 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.356330 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fab00ae-2865-42dc-8622-ca24543bddec-kube-api-access-szcxt" (OuterVolumeSpecName: "kube-api-access-szcxt") pod "4fab00ae-2865-42dc-8622-ca24543bddec" (UID: "4fab00ae-2865-42dc-8622-ca24543bddec"). InnerVolumeSpecName "kube-api-access-szcxt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 04:12:46.369989 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.369963 2569 generic.go:358] "Generic (PLEG): container finished" podID="4fab00ae-2865-42dc-8622-ca24543bddec" containerID="759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564" exitCode=0
Apr 21 04:12:46.370079 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.370007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt" event={"ID":"4fab00ae-2865-42dc-8622-ca24543bddec","Type":"ContainerDied","Data":"759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564"}
Apr 21 04:12:46.370079 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.370024 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"
Apr 21 04:12:46.370079 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.370042 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-wk8zt" event={"ID":"4fab00ae-2865-42dc-8622-ca24543bddec","Type":"ContainerDied","Data":"923530f7046c70fedd095abd6cf93abab11e9eb48802422baf49c36250720acd"}
Apr 21 04:12:46.370079 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.370059 2569 scope.go:117] "RemoveContainer" containerID="759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564"
Apr 21 04:12:46.378089 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.378070 2569 scope.go:117] "RemoveContainer" containerID="759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564"
Apr 21 04:12:46.378339 ip-10-0-138-120 kubenswrapper[2569]: E0421 04:12:46.378314 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564\": container with ID starting with 759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564 not found: ID does not exist" containerID="759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564"
Apr 21 04:12:46.378394 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.378354 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564"} err="failed to get container status \"759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564\": rpc error: code = NotFound desc = could not find container \"759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564\": container with ID starting with 759883e04b28230932256306bf7e351c22c5fc1888512553c5959ba94e162564 not found: ID does not exist"
Apr 21 04:12:46.388486 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.388461 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:12:46.391346 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.391328 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-wk8zt"]
Apr 21 04:12:46.455041 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.454990 2569 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4fab00ae-2865-42dc-8622-ca24543bddec-config-file\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\""
Apr 21 04:12:46.455041 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:46.455015 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-szcxt\" (UniqueName: \"kubernetes.io/projected/4fab00ae-2865-42dc-8622-ca24543bddec-kube-api-access-szcxt\") on node \"ip-10-0-138-120.ec2.internal\" DevicePath \"\""
Apr 21 04:12:47.474065 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:47.474028 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fab00ae-2865-42dc-8622-ca24543bddec" path="/var/lib/kubelet/pods/4fab00ae-2865-42dc-8622-ca24543bddec/volumes"
Apr 21 04:12:57.391559 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:57.391530 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:12:57.392217 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:57.392202 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:12:57.397990 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:57.397971 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:12:57.398370 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:12:57.398352 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:13:00.658480 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.658447 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-49zxh"]
Apr 21 04:13:00.658834 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.658821 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fab00ae-2865-42dc-8622-ca24543bddec" containerName="limitador"
Apr 21 04:13:00.658920 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.658836 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fab00ae-2865-42dc-8622-ca24543bddec" containerName="limitador"
Apr 21 04:13:00.658920 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.658888 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fab00ae-2865-42dc-8622-ca24543bddec" containerName="limitador"
Apr 21 04:13:00.663287 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.663268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.665905 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.665882 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 21 04:13:00.667476 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.667450 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-49zxh"]
Apr 21 04:13:00.690486 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.690461 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-49zxh"]
Apr 21 04:13:00.778667 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.778631 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b6d4d710-3c7e-402b-941e-091ae6c02f78-config-file\") pod \"limitador-limitador-64c8f475fb-49zxh\" (UID: \"b6d4d710-3c7e-402b-941e-091ae6c02f78\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.778831 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.778695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6lf9\" (UniqueName: \"kubernetes.io/projected/b6d4d710-3c7e-402b-941e-091ae6c02f78-kube-api-access-c6lf9\") pod \"limitador-limitador-64c8f475fb-49zxh\" (UID: \"b6d4d710-3c7e-402b-941e-091ae6c02f78\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.879319 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.879283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b6d4d710-3c7e-402b-941e-091ae6c02f78-config-file\") pod \"limitador-limitador-64c8f475fb-49zxh\" (UID: \"b6d4d710-3c7e-402b-941e-091ae6c02f78\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.879463 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.879345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6lf9\" (UniqueName: \"kubernetes.io/projected/b6d4d710-3c7e-402b-941e-091ae6c02f78-kube-api-access-c6lf9\") pod \"limitador-limitador-64c8f475fb-49zxh\" (UID: \"b6d4d710-3c7e-402b-941e-091ae6c02f78\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.879868 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.879848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b6d4d710-3c7e-402b-941e-091ae6c02f78-config-file\") pod \"limitador-limitador-64c8f475fb-49zxh\" (UID: \"b6d4d710-3c7e-402b-941e-091ae6c02f78\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.887535 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.887507 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6lf9\" (UniqueName: \"kubernetes.io/projected/b6d4d710-3c7e-402b-941e-091ae6c02f78-kube-api-access-c6lf9\") pod \"limitador-limitador-64c8f475fb-49zxh\" (UID: \"b6d4d710-3c7e-402b-941e-091ae6c02f78\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:00.973897 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:00.973821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:01.091274 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:01.091230 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-49zxh"]
Apr 21 04:13:01.093347 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:13:01.093308 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d4d710_3c7e_402b_941e_091ae6c02f78.slice/crio-5d48fa976c19496cac613c0bb85b4eb0f794e64ca91159f599f6f5dd8863a469 WatchSource:0}: Error finding container 5d48fa976c19496cac613c0bb85b4eb0f794e64ca91159f599f6f5dd8863a469: Status 404 returned error can't find the container with id 5d48fa976c19496cac613c0bb85b4eb0f794e64ca91159f599f6f5dd8863a469
Apr 21 04:13:01.095016 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:01.094999 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:13:01.420680 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:01.420648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh" event={"ID":"b6d4d710-3c7e-402b-941e-091ae6c02f78","Type":"ContainerStarted","Data":"e81ec40b9121d6c390ebb76779c001a06e1d5a66404aaed505451491eece0384"}
Apr 21 04:13:01.420680 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:01.420682 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh" event={"ID":"b6d4d710-3c7e-402b-941e-091ae6c02f78","Type":"ContainerStarted","Data":"5d48fa976c19496cac613c0bb85b4eb0f794e64ca91159f599f6f5dd8863a469"}
Apr 21 04:13:01.420890 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:01.420768 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:13:01.436371 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:01.436332 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh" podStartSLOduration=1.43631957 podStartE2EDuration="1.43631957s" podCreationTimestamp="2026-04-21 04:13:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:13:01.434741168 +0000 UTC m=+904.602137606" watchObservedRunningTime="2026-04-21 04:13:01.43631957 +0000 UTC m=+904.603716008"
Apr 21 04:13:12.424140 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:13:12.424100 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-49zxh"
Apr 21 04:17:57.413578 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:17:57.413545 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:17:57.415709 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:17:57.415690 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:17:57.420027 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:17:57.420011 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:17:57.421982 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:17:57.421966 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log"
Apr 21 04:18:29.885141 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:29.885096 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-w476t_65bde943-7b77-46db-b7e7-878233e5c51f/global-pull-secret-syncer/0.log"
Apr 21 04:18:29.959065 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:29.959036 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fs2vm_5c3da277-f4fb-47af-9b13-a2de86f37142/konnectivity-agent/0.log"
Apr 21 04:18:30.040898 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:30.040852 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-120.ec2.internal_4c5556f6af36052a906fa0aef20bfb6c/haproxy/0.log"
Apr 21 04:18:34.001445 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:34.001416 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-znjzj_7471ff18-2c0c-46ed-8329-84f532a99914/manager/0.log"
Apr 21 04:18:34.025445 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:34.025420 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-wcxv8_493f846f-1fb5-4f44-b4d6-983bc06410cb/kuadrant-console-plugin/0.log"
Apr 21 04:18:34.084871 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:34.084847 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-64c8f475fb-49zxh_b6d4d710-3c7e-402b-941e-091ae6c02f78/limitador/0.log"
Apr 21 04:18:35.231819 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.231779 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4knsl_7f787778-1993-43c5-97a5-e8a841e298b8/kube-state-metrics/0.log"
Apr 21 04:18:35.249798 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.249766 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4knsl_7f787778-1993-43c5-97a5-e8a841e298b8/kube-rbac-proxy-main/0.log"
Apr 21 04:18:35.271926 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.271896 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-4knsl_7f787778-1993-43c5-97a5-e8a841e298b8/kube-rbac-proxy-self/0.log"
Apr 21 04:18:35.358315 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.358291 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5djqb_87911e74-ac2f-440d-ae73-34b23d6d9a70/node-exporter/0.log"
Apr 21 04:18:35.376991 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.376972 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5djqb_87911e74-ac2f-440d-ae73-34b23d6d9a70/kube-rbac-proxy/0.log"
Apr 21 04:18:35.397500 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.397480 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5djqb_87911e74-ac2f-440d-ae73-34b23d6d9a70/init-textfile/0.log"
Apr 21 04:18:35.574577 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.574552 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-95tf9_cae1cda9-6e23-44ac-a8f0-2e07752524d8/kube-rbac-proxy-main/0.log"
Apr 21 04:18:35.598181 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.598158 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-95tf9_cae1cda9-6e23-44ac-a8f0-2e07752524d8/kube-rbac-proxy-self/0.log"
Apr 21 04:18:35.618442 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.618377 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-95tf9_cae1cda9-6e23-44ac-a8f0-2e07752524d8/openshift-state-metrics/0.log"
Apr 21 04:18:35.657613 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.657588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/prometheus/0.log"
Apr 21 04:18:35.674920 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.674897 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/config-reloader/0.log"
Apr 21 04:18:35.702508 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.702482 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/thanos-sidecar/0.log"
Apr 21 04:18:35.723286 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.723265 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/kube-rbac-proxy-web/0.log"
Apr 21 04:18:35.748090 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.748064 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/kube-rbac-proxy/0.log"
Apr 21 04:18:35.775795 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.775763 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/kube-rbac-proxy-thanos/0.log"
Apr 21 04:18:35.806608 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.806579 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e66aa4b6-8f6e-4c13-b765-490a783017ca/init-config-reloader/0.log"
Apr 21 04:18:35.889337 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:35.889277 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-w6x4f_8a74424a-0a97-406e-a9ab-19b6f495be26/prometheus-operator-admission-webhook/0.log"
Apr 21 04:18:37.756554 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:37.756528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/1.log"
Apr 21 04:18:37.766000 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:37.765970 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-cjfz9_6f274aa3-e379-471c-9815-cb6212a1b1ef/console-operator/2.log"
Apr 21 04:18:38.679491 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.679460 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-8g58t_54eef815-d11b-4203-b390-7baf2c44b620/volume-data-source-validator/0.log"
Apr 21 04:18:38.708499 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.708473 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"]
Apr 21 04:18:38.711988 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.711968 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.714286 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.714265 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dhdf9\"/\"default-dockercfg-h8wlt\""
Apr 21 04:18:38.714394 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.714305 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dhdf9\"/\"kube-root-ca.crt\""
Apr 21 04:18:38.715136 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.715118 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dhdf9\"/\"openshift-service-ca.crt\""
Apr 21 04:18:38.721068 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.721045 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"]
Apr 21 04:18:38.772131 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.772100 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-sys\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.772131 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.772128 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-proc\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.772575 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.772148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlrg\" (UniqueName: \"kubernetes.io/projected/6cff989c-b633-4771-ad4d-d42f5aff3c4f-kube-api-access-tvlrg\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.772575 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.772291 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-lib-modules\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.772575 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.772314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-podres\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873663 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-lib-modules\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873663 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-podres\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873716 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-sys\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873737 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-proc\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873797 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-proc\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873803 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-lib-modules\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-sys\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.873894 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6cff989c-b633-4771-ad4d-d42f5aff3c4f-podres\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.874097 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.873899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlrg\" (UniqueName: \"kubernetes.io/projected/6cff989c-b633-4771-ad4d-d42f5aff3c4f-kube-api-access-tvlrg\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:38.882062 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:38.882043 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvlrg\" (UniqueName: \"kubernetes.io/projected/6cff989c-b633-4771-ad4d-d42f5aff3c4f-kube-api-access-tvlrg\") pod \"perf-node-gather-daemonset-ddhtt\" (UID: \"6cff989c-b633-4771-ad4d-d42f5aff3c4f\") " pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:39.022911 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.022821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"
Apr 21 04:18:39.144121 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.144094 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt"]
Apr 21 04:18:39.146763 ip-10-0-138-120 kubenswrapper[2569]: W0421 04:18:39.146728 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6cff989c_b633_4771_ad4d_d42f5aff3c4f.slice/crio-1153e045fd2096be719f11fbfa53a5d55572c50a9c38470d3db1aaa4b9314284 WatchSource:0}: Error finding container 1153e045fd2096be719f11fbfa53a5d55572c50a9c38470d3db1aaa4b9314284: Status 404 returned error can't find the container with id 1153e045fd2096be719f11fbfa53a5d55572c50a9c38470d3db1aaa4b9314284
Apr 21 04:18:39.148560 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.148543 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 04:18:39.477058 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.477035 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p2sxx_582f3951-c97b-47a0-9cb1-39d85f78b692/dns/0.log"
Apr 21 04:18:39.495846 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.495824 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p2sxx_582f3951-c97b-47a0-9cb1-39d85f78b692/kube-rbac-proxy/0.log"
Apr 21 04:18:39.520417 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.520388 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt" event={"ID":"6cff989c-b633-4771-ad4d-d42f5aff3c4f","Type":"ContainerStarted","Data":"d9267d64c13d5b867a170346953011197cc3b1ba71bd5c7d710eb84a0a8aae1e"}
Apr 21 04:18:39.520417 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.520417 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt" event={"ID":"6cff989c-b633-4771-ad4d-d42f5aff3c4f","Type":"ContainerStarted","Data":"1153e045fd2096be719f11fbfa53a5d55572c50a9c38470d3db1aaa4b9314284"} Apr 21 04:18:39.520549 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.520511 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt" Apr 21 04:18:39.535441 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.535395 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt" podStartSLOduration=1.535379082 podStartE2EDuration="1.535379082s" podCreationTimestamp="2026-04-21 04:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:18:39.534509686 +0000 UTC m=+1242.701906124" watchObservedRunningTime="2026-04-21 04:18:39.535379082 +0000 UTC m=+1242.702775523" Apr 21 04:18:39.560676 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.560654 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-r5zq9_276dfbb7-a93a-4da8-8b3b-f919c9642bca/dns-node-resolver/0.log" Apr 21 04:18:39.998707 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:39.998659 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7c98bb8c4b-66m2p_9a908e1d-c424-42bf-9ffc-a48edafa8faa/registry/0.log" Apr 21 04:18:40.019052 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:40.019023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-bw45p_e7f6204a-5b3d-4f0f-8b89-3111e460af8a/node-ca/0.log" Apr 21 04:18:41.343799 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:41.343771 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-xmfjq_4568e112-4a47-4610-a776-ed8ac69d2ca9/serve-healthcheck-canary/0.log" Apr 21 04:18:41.735189 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:41.735114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-657t5_2fea1f5b-33a1-4b14-a6f9-dc99d97673d2/insights-operator/0.log" Apr 21 04:18:41.735545 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:41.735528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-657t5_2fea1f5b-33a1-4b14-a6f9-dc99d97673d2/insights-operator/1.log" Apr 21 04:18:41.756452 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:41.756420 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7nh4z_12e2811d-0a97-4e4c-a872-22799a21a2c4/kube-rbac-proxy/0.log" Apr 21 04:18:41.776718 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:41.776693 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7nh4z_12e2811d-0a97-4e4c-a872-22799a21a2c4/exporter/0.log" Apr 21 04:18:41.796518 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:41.796491 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7nh4z_12e2811d-0a97-4e4c-a872-22799a21a2c4/extractor/0.log" Apr 21 04:18:44.008542 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:44.008515 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-57f75ff788-sk2b6_06e5a302-848b-4777-8019-47965aa5da8e/manager/0.log" Apr 21 04:18:45.533492 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:45.533468 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dhdf9/perf-node-gather-daemonset-ddhtt" Apr 21 04:18:49.238545 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.238517 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/kube-multus-additional-cni-plugins/0.log" Apr 21 04:18:49.261638 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.261612 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/egress-router-binary-copy/0.log" Apr 21 04:18:49.281880 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.281859 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/cni-plugins/0.log" Apr 21 04:18:49.301966 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.301949 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/bond-cni-plugin/0.log" Apr 21 04:18:49.321736 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.321717 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/routeoverride-cni/0.log" Apr 21 04:18:49.341397 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.341378 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/whereabouts-cni-bincopy/0.log" Apr 21 04:18:49.361064 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.361032 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wfmzk_24c30e95-1770-4975-87be-9b1494d8904c/whereabouts-cni/0.log" Apr 21 04:18:49.588483 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.588414 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lhbpw_64236d25-036a-4831-9f0c-63b1efd05cc1/kube-multus/0.log" Apr 21 
04:18:49.678058 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.678031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cxzzc_03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3/network-metrics-daemon/0.log" Apr 21 04:18:49.695771 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:49.695750 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cxzzc_03e7fd7b-a1ea-4978-8b92-3e0b1c40ada3/kube-rbac-proxy/0.log" Apr 21 04:18:51.147892 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.147860 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-controller/0.log" Apr 21 04:18:51.167728 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.167697 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/0.log" Apr 21 04:18:51.178088 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.178065 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovn-acl-logging/1.log" Apr 21 04:18:51.199508 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.199485 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/kube-rbac-proxy-node/0.log" Apr 21 04:18:51.221344 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.221321 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 04:18:51.240071 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.240052 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/northd/0.log" Apr 21 04:18:51.260139 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.260115 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/nbdb/0.log" Apr 21 04:18:51.284785 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.284763 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/sbdb/0.log" Apr 21 04:18:51.440197 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:51.440120 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bknrd_7eb134d5-89c6-46e0-ae15-02b2684b117a/ovnkube-controller/0.log" Apr 21 04:18:52.482979 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:52.482950 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-55n56_79dfa283-9e0e-4895-ba2d-3c4af2c1bbd6/check-endpoints/0.log" Apr 21 04:18:52.527662 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:52.527630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vsvml_97028780-9603-49f8-abdd-0a7e0a1cef8a/network-check-target-container/0.log" Apr 21 04:18:53.557666 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:53.557638 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zfsw2_229b10ec-c401-4070-945d-fa92e56f6443/iptables-alerter/0.log" Apr 21 04:18:54.267159 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:54.267136 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-k9mlc_33d924f5-c37f-4a3c-96e4-0f8db5a4a7dd/tuned/0.log" Apr 21 04:18:56.161206 ip-10-0-138-120 kubenswrapper[2569]: I0421 04:18:56.161173 2569 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2mhj2_6035bacf-c9f3-42ee-bbb0-49c433954da7/cluster-samples-operator/0.log"