Apr 24 21:27:02.998867 ip-10-0-136-65 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:03.453912 ip-10-0-136-65 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:03.453912 ip-10-0-136-65 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:03.453912 ip-10-0-136-65 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:03.453912 ip-10-0-136-65 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:03.453912 ip-10-0-136-65 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:03.456750 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.456654 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:03.461319 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461303 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:03.461319 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461320 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461323 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461329 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461333 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461336 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461339 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461342 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461345 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461348 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461351 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461354 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461357 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461360 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461363 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461366 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461369 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461371 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461373 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461376 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:03.461380 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461379 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461381 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461384 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461387 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461390 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461393 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461396 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461398 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461401 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461403 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461406 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461409 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461411 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461413 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461416 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461419 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461422 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461424 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461427 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:03.461871 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461429 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461432 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461435 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461437 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461440 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461442 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461445 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461447 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461450 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461452 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461454 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461458 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461462 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461464 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461467 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461472 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461475 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461478 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461480 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:03.462368 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461483 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461485 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461488 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461491 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461494 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461496 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461499 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461502 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461504 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461507 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461509 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461512 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461514 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461517 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461520 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461523 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461526 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461529 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461532 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461535 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461537 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:03.462854 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461540 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:03.463361 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461542 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:03.463361 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461545 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:03.463361 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461548 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:03.463361 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461550 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:03.463361 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461553 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:03.463361 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.461555 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463392 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463399 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463403 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463406 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463409 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463412 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463414 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463417 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463419 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463422 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463425 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463427 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463430 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463432 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463435 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463437 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463440 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463442 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463445 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:03.463510 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463448 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463450 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463453 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463455 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463458 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463460 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463463 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463465 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463467 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463470 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463472 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463475 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463477 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463480 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463484 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463488 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463491 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463493 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463496 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463498 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:03.464025 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463501 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463503 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463506 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463508 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463511 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463523 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463526 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463529 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463532 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463534 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463537 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463540 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463542 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463545 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463550 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463553 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463556 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463559 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463563 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:03.464529 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463565 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463580 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463583 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463586 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463588 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463591 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463593 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463596 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463599 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463602 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463604 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463609 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463613 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463615 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463618 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463621 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463623 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463627 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463629 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463632 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:03.465122 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463634 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463637 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463639 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463642 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463645 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463648 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463650 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.463653 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463730 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463737 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463744 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463749 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463753 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463757 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463761 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463765 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463769 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463772 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463776 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463780 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463783 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463786 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463789 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:03.465635 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463792 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463795 2577 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463798 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463801 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463806 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463809 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463812 2577 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463815 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463818 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463822 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463825 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463829 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463832 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463835 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463838 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463841 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463845 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463848 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463853 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463856 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463859 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463862 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463865 2577 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463868 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463872 2577 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:03.466215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463876 2577 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463878 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463882 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463885 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463889 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463892 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463895 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463898 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463901 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463904 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463907 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463911 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463914 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463917 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:03.466847
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463919 2577 flags.go:64] FLAG: --feature-gates="" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463923 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463926 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463929 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463932 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463935 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463938 2577 flags.go:64] FLAG: --help="false" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463941 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463944 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463947 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:03.466847 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463950 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463959 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463963 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463966 2577 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463969 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463971 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463974 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463977 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463981 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463984 2577 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463987 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463990 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463993 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463996 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.463999 2577 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464002 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464005 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464008 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 
21:27:03.464013 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464016 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464019 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464022 2577 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464025 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464028 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:03.467425 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464031 2577 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464034 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464039 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464042 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464046 2577 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464050 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464053 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464056 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464059 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 
21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464062 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464066 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464069 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464078 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464081 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464084 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464087 2577 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464090 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464096 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464099 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464102 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464105 2577 flags.go:64] FLAG: --port="10250" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464108 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464111 2577 flags.go:64] FLAG: 
--provider-id="aws:///us-east-1a/i-0cad3615edf1b48a0" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464114 2577 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:03.468027 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464117 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464120 2577 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464123 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464126 2577 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464130 2577 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464132 2577 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464135 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464138 2577 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464142 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464145 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464148 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464151 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464154 2577 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464157 2577 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464160 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464163 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464166 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464169 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464172 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464176 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464183 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464186 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464189 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464192 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464195 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464198 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:03.468624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464201 2577 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464204 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 
21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464210 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464213 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464216 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464221 2577 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464224 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464226 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464230 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464232 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464235 2577 flags.go:64] FLAG: --v="2" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464240 2577 flags.go:64] FLAG: --version="false" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464244 2577 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464248 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464251 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464343 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464347 2577 feature_gate.go:328] unrecognized feature 
gate: ExternalSnapshotMetadata Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464350 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464353 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464355 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464358 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464361 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464364 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:03.469252 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464367 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464370 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464373 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464378 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464381 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464384 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:03.469838 ip-10-0-136-65 
kubenswrapper[2577]: W0424 21:27:03.464387 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464390 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464392 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464396 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464398 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464401 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464404 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464406 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464409 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464411 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464414 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464416 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464419 2577 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 24 21:27:03.469838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464421 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464424 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464427 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464429 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464432 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464434 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464437 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464439 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464441 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464444 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464446 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464449 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:03.470710 ip-10-0-136-65 
kubenswrapper[2577]: W0424 21:27:03.464451 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464454 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464456 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464459 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464463 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464466 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464469 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464471 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:03.470710 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464474 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464476 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464479 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464481 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464484 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 
21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464487 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464490 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464493 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464496 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464498 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464500 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464503 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464505 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464508 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464510 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464513 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464517 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464520 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464523 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464526 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:03.471531 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464528 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464531 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464534 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464537 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464540 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464543 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464546 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464549 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464553 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464555 2577 feature_gate.go:328] unrecognized 
feature gate: VSphereHostVMGroupZonal Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464558 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464561 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464563 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464566 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464581 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464583 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464586 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464589 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:03.472402 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.464591 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:03.473194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.464600 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:03.473671 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.473646 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:27:03.473737 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.473673 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:27:03.473821 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473810 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473822 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473828 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473838 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473844 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473848 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473853 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473857 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473862 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473867 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:03.473870 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473871 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473876 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473881 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473886 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473890 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473895 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473904 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473910 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473914 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473918 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473923 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473927 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473931 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473935 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473940 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473945 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473949 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473953 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473958 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:03.474310 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473967 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473971 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473975 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473980 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473984 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473989 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473993 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.473998 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474002 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474006 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474011 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474015 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474027 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474035 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474041 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474047 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474052 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474056 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474060 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:03.475154 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474065 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474069 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474075 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474080 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474085 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474094 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474099 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474103 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474107 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474112 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474117 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474122 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474127 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474131 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474136 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474143 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474150 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474159 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474164 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474168 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:03.475817 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474173 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474178 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474182 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474186 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474191 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474195 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474199 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474204 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474209 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474213 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474222 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474227 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474232 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474236 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474241 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474246 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474251 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:03.476426 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474255 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.474264 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474620 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474631 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474637 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474643 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474648 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474655 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474661 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474667 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474672 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474682 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474687 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474692 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474697 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:03.477048 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474701 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474705 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474710 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474715 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474719 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474723 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474728 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474732 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474737 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474747 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474752 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474756 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474761 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474765 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474769 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474774 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474779 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474784 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474788 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474792 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:03.477440 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474797 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474801 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474810 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474815 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474819 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474824 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474829 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474833 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474837 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474841 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474846 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474850 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474854 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474858 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474863 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474873 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474878 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474882 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474886 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474890 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:03.477973 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474896 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474900 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474905 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474910 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474915 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474919 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474924 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474932 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474937 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474943 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474948 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474953 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474958 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474962 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474967 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474971 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474975 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474980 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474984 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474988 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:03.478456 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.474997 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475001 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475006 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475010 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475014 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475018 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475022 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475029 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475035 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475041 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475045 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475050 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:03.475059 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.475068 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:03.478952 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.475829 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:03.481907 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.481889 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:03.483671 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.483656 2577 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:03.483768 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.483751 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:03.483812 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.483795 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:03.510140 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.510124 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:03.512338 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.512321 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:03.524811 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.524794 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:03.530764 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.530748 2577 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:03.532884 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.532862 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:03.534949 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.534930 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 89412218-7564-4e54-9bfa-94ed586b7fd4:/dev/nvme0n1p3 ad6daa14-d3eb-4b1b-b7d9-153b947c8e75:/dev/nvme0n1p4]
Apr 24 21:27:03.535016 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.534949 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:03.539857 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.539738 2577 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:03.538461867 +0000 UTC m=+0.421448060 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100433 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec211b023e57b9e871d98caf7535f302 SystemUUID:ec211b02-3e57-b9e8-71d9-8caf7535f302 BootID:5cbd3fe3-708b-499f-99a5-b4aa1c976426 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fe:1b:5c:8d:5b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fe:1b:5c:8d:5b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:a3:be:58:27:2d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:03.539857 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.539845 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:03.539970 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.539929 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:03.540826 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.540806 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:03.540973 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.540827 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-65.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:03.541014 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.540983 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:03.541014 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.540991 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:03.541014 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.541004 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:03.541762 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.541750 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:03.542974 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.542963 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:03.543070 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.543062 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:03.547181 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.547159 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:03.548037 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.548022 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:03.548121 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.548046 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:03.548121 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.548067 2577 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 24 21:27:03.548121 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.548083 2577 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:03.548121 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.548098 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:27:03.552067 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.552052 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:03.552133 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.552074 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:03.554652 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.554635 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:03.555784 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.555771 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:03.557611 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557598 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557617 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557624 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557630 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557635 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:03.557661 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557641 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557647 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557652 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:03.557661 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557660 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:03.557877 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557667 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:03.557877 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557675 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:03.557877 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.557684 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:03.559434 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.559423 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:03.559434 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.559433 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:03.561962 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.561938 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-65.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:03.562032 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.561966 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:03.562830 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.562817 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:03.562882 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.562855 2577 server.go:1295] "Started kubelet" Apr 24 21:27:03.563606 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.563585 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-65.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:03.563686 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.563636 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:03.563610 ip-10-0-136-65 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:27:03.563822 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.563697 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:03.564330 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.564305 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:03.565222 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.565201 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:03.565694 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.565675 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:03.569790 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.569766 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:03.570792 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.570777 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:03.571272 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.571256 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:03.572515 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.572495 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:03.572515 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.572518 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:03.572674 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.572553 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:03.572674 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.572643 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:03.572674 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.572652 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:03.572819 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.572721 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:03.573218 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.573199 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:03.573218 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.573219 2577 factory.go:55] Registering systemd factory Apr 24 21:27:03.573348 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.573227 2577 factory.go:223] Registration of the systemd container 
factory successfully Apr 24 21:27:03.573460 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.573443 2577 factory.go:153] Registering CRI-O factory Apr 24 21:27:03.573509 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.573465 2577 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:03.574171 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.574155 2577 factory.go:103] Registering Raw factory Apr 24 21:27:03.574260 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.574178 2577 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:03.574586 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.574556 2577 manager.go:319] Starting recovery of all containers Apr 24 21:27:03.574774 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.574725 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-65.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:03.574894 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.574877 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:03.575920 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.574705 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-65.ec2.internal.18a9682214e0deeb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-65.ec2.internal,UID:ip-10-0-136-65.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-65.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.562829547 +0000 UTC m=+0.445815739,LastTimestamp:2026-04-24 21:27:03.562829547 +0000 UTC m=+0.445815739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-65.ec2.internal,}" Apr 24 21:27:03.586287 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.586176 2577 manager.go:324] Recovery completed Apr 24 21:27:03.587475 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.587458 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 21:27:03.590526 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.590514 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.592855 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.592787 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.592855 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.592819 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.592855 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.592832 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.593362 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.593348 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 
21:27:03.593362 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.593359 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:03.593443 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.593374 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:03.594936 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.594920 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k98x" Apr 24 21:27:03.595056 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.594996 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-65.ec2.internal.18a9682216aa3c11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-65.ec2.internal,UID:ip-10-0-136-65.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-65.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-65.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.592803345 +0000 UTC m=+0.475789543,LastTimestamp:2026-04-24 21:27:03.592803345 +0000 UTC m=+0.475789543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-65.ec2.internal,}" Apr 24 21:27:03.596311 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.596297 2577 policy_none.go:49] "None policy: Start" Apr 24 21:27:03.596311 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.596314 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:03.596403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.596325 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 24 
21:27:03.603556 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.603490 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-65.ec2.internal.18a9682216aa8e9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-65.ec2.internal,UID:ip-10-0-136-65.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-65.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-65.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.592824474 +0000 UTC m=+0.475810671,LastTimestamp:2026-04-24 21:27:03.592824474 +0000 UTC m=+0.475810671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-65.ec2.internal,}" Apr 24 21:27:03.603804 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.603783 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k98x" Apr 24 21:27:03.633137 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633123 2577 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.633155 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633165 2577 server.go:85] "Starting device plugin registration server" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633354 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633367 2577 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633484 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633585 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.633595 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.634051 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:03.651444 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.634091 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:03.695963 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.695919 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:03.697139 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.697125 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:03.697204 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.697151 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:03.697204 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.697168 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:27:03.697204 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.697175 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:03.697339 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.697212 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:03.700016 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.699999 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:03.733506 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.733460 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.734475 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.734461 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.734538 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.734490 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.734538 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.734499 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.734538 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.734521 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.744394 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.744375 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.744445 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.744398 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-65.ec2.internal\": node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:03.783696 
ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.783678 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:03.797967 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.797948 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal"] Apr 24 21:27:03.798016 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.798007 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.799463 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.799449 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.799542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.799475 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.799542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.799486 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.801705 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.801693 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.801840 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.801826 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.801877 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.801856 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.802337 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.802323 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.802412 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.802349 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.802412 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.802377 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.802412 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.802405 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.802517 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.802425 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.802517 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.802439 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.804588 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.804559 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.804655 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.804599 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.805197 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.805184 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.805255 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.805217 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.805255 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.805230 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.828177 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.828158 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-65.ec2.internal\" not found" node="ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.832561 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.832548 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-65.ec2.internal\" not found" node="ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.874243 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.874222 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7f1b6e2046d6b808fbf08cbfde4e508e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal\" (UID: \"7f1b6e2046d6b808fbf08cbfde4e508e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.874308 ip-10-0-136-65 kubenswrapper[2577]: I0424 
21:27:03.874256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f1b6e2046d6b808fbf08cbfde4e508e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal\" (UID: \"7f1b6e2046d6b808fbf08cbfde4e508e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.874308 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.874277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9fc64b0f94b5e05651be58c19d8e03a7-config\") pod \"kube-apiserver-proxy-ip-10-0-136-65.ec2.internal\" (UID: \"9fc64b0f94b5e05651be58c19d8e03a7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.884317 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.884287 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:03.974488 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.974464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f1b6e2046d6b808fbf08cbfde4e508e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal\" (UID: \"7f1b6e2046d6b808fbf08cbfde4e508e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.974488 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.974491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9fc64b0f94b5e05651be58c19d8e03a7-config\") pod \"kube-apiserver-proxy-ip-10-0-136-65.ec2.internal\" (UID: \"9fc64b0f94b5e05651be58c19d8e03a7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.974656 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.974508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7f1b6e2046d6b808fbf08cbfde4e508e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal\" (UID: \"7f1b6e2046d6b808fbf08cbfde4e508e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.974656 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.974534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7f1b6e2046d6b808fbf08cbfde4e508e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal\" (UID: \"7f1b6e2046d6b808fbf08cbfde4e508e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.974656 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.974557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f1b6e2046d6b808fbf08cbfde4e508e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal\" (UID: \"7f1b6e2046d6b808fbf08cbfde4e508e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.974656 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:03.974627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9fc64b0f94b5e05651be58c19d8e03a7-config\") pod \"kube-apiserver-proxy-ip-10-0-136-65.ec2.internal\" (UID: \"9fc64b0f94b5e05651be58c19d8e03a7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" Apr 24 21:27:03.984611 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:03.984551 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.085360 ip-10-0-136-65 
kubenswrapper[2577]: E0424 21:27:04.085334 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.130536 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.130505 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:04.135107 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.135090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" Apr 24 21:27:04.185906 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.185877 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.286499 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.286425 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.387034 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.386994 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.458063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.458037 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:04.483820 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.483800 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:04.483934 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.483915 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted 
less than a second and no items received" Apr 24 21:27:04.483970 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.483948 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:04.487951 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.487934 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.505712 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.505693 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:04.571708 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.571684 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:04.584211 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.584189 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:04.588491 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.588475 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.605216 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.605198 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wd6dq" Apr 24 21:27:04.606256 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.606231 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:03 +0000 UTC" deadline="2027-11-22 11:12:38.706559566 +0000 UTC" Apr 24 
21:27:04.606308 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.606256 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13837h45m34.100306434s" Apr 24 21:27:04.614536 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.614520 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wd6dq" Apr 24 21:27:04.688776 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.688748 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.733804 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:04.733764 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc64b0f94b5e05651be58c19d8e03a7.slice/crio-a6e405ad6d4a5fb2057ea28b35327035c3b974c04f25408d7ab902ba86ef9344 WatchSource:0}: Error finding container a6e405ad6d4a5fb2057ea28b35327035c3b974c04f25408d7ab902ba86ef9344: Status 404 returned error can't find the container with id a6e405ad6d4a5fb2057ea28b35327035c3b974c04f25408d7ab902ba86ef9344 Apr 24 21:27:04.734537 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:04.734513 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f1b6e2046d6b808fbf08cbfde4e508e.slice/crio-7c3f80f70a8701d2955cfef991a122e198a15dda9b6074d3a30de744a885513a WatchSource:0}: Error finding container 7c3f80f70a8701d2955cfef991a122e198a15dda9b6074d3a30de744a885513a: Status 404 returned error can't find the container with id 7c3f80f70a8701d2955cfef991a122e198a15dda9b6074d3a30de744a885513a Apr 24 21:27:04.738761 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:04.738742 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:04.789775 ip-10-0-136-65 kubenswrapper[2577]: E0424 
21:27:04.789749 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.890326 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.890271 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:04.990840 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:04.990811 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-65.ec2.internal\" not found" Apr 24 21:27:05.071292 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.071264 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:05.071702 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.071509 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" Apr 24 21:27:05.090084 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.090059 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:05.091303 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.091277 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" Apr 24 21:27:05.101105 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.101085 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:05.550171 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.550092 2577 apiserver.go:52] "Watching apiserver" Apr 24 21:27:05.560728 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.560701 2577 reflector.go:430] "Caches populated" type="*v1.Pod" 
reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:05.561156 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.561133 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-672zj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal","openshift-multus/multus-additional-cni-plugins-pvrrx","openshift-network-diagnostics/network-check-target-g2stj","openshift-ovn-kubernetes/ovnkube-node-lvpcz","kube-system/konnectivity-agent-7mlvd","kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld","openshift-dns/node-resolver-s74k7","openshift-multus/multus-wl7mb","openshift-multus/network-metrics-daemon-r5pbt","openshift-network-operator/iptables-alerter-84qsj","openshift-cluster-node-tuning-operator/tuned-dc7wx"] Apr 24 21:27:05.564299 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.564275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.566445 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.566422 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.566887 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.566734 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:05.566887 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.566777 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-57sgt\"" Apr 24 21:27:05.567061 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.567012 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.567061 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.567041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.567160 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.567070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:05.568917 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.568741 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:05.568917 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.568764 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:05.568917 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.568791 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtq92\"" Apr 24 21:27:05.570894 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.570872 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:05.571002 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.570940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:05.571002 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.570961 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:05.573208 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.573185 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.573611 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.573443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:05.573685 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.573660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:05.573788 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.573766 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-m5pdd\"" Apr 24 21:27:05.575535 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.575490 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-672zj" Apr 24 21:27:05.575673 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.575656 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5jvrh\"" Apr 24 21:27:05.575762 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.575671 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.575915 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.575897 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:05.576015 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.575999 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.577704 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.577674 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.577831 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.577797 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zq2fn\"" Apr 24 21:27:05.578134 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.578114 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.578215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.578146 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:05.578434 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.578415 2577 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:05.578516 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.578491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:05.580826 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.580810 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84qsj" Apr 24 21:27:05.582058 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.582155 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cgl5\" (UniqueName: \"kubernetes.io/projected/44478f44-ad28-4f73-9fd4-429d584502ef-kube-api-access-6cgl5\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.582155 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/94182b7e-3409-484c-82ea-df615ef6141e-agent-certs\") pod 
\"konnectivity-agent-7mlvd\" (UID: \"94182b7e-3409-484c-82ea-df615ef6141e\") " pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:05.582155 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5617082-ad48-4271-8c23-19c149807eba-host\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj" Apr 24 21:27:05.582155 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-os-release\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582353 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-etc-selinux\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.582353 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-system-cni-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582353 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f6849759-b993-4293-a216-c7f7861f1c3f-cni-binary-copy\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582353 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.582353 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:05.582542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvbb\" (UniqueName: \"kubernetes.io/projected/3ea34c37-0aa8-49fd-94af-4f419a1c9130-kube-api-access-kfvbb\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.582542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-multus-certs\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 
21:27:05.582542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-registration-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.582542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-device-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.582542 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-sys-fs\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5617082-ad48-4271-8c23-19c149807eba-serviceca\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-socket-dir-parent\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-kubelet\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhznf\" (UniqueName: \"kubernetes.io/projected/d5617082-ad48-4271-8c23-19c149807eba-kube-api-access-vhznf\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582641 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-cni-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-conf-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582698 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjz5\" (UniqueName: \"kubernetes.io/projected/f6849759-b993-4293-a216-c7f7861f1c3f-kube-api-access-kpjz5\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-system-cni-dir\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.582763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/94182b7e-3409-484c-82ea-df615ef6141e-konnectivity-ca\") pod \"konnectivity-agent-7mlvd\" (UID: \"94182b7e-3409-484c-82ea-df615ef6141e\") " pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-socket-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-netns\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582893 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-cni-bin\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6849759-b993-4293-a216-c7f7861f1c3f-multus-daemon-config\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.582988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-os-release\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-cnibin\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-k8s-cni-cncf-io\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-cni-multus\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-etc-kubernetes\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-hostroot\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.583153 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-cnibin\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.583747 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583180 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:05.583747 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583199 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:05.583747 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583669 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2ld2q\""
Apr 24 21:27:05.583907 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.583890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:27:05.584998 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.584499 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.588035 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.587996 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.588453 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.588419 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:27:05.589544 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.589341 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:27:05.589544 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.589348 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:27:05.589544 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.589369 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:27:05.589750 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.589670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:27:05.589750 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.589720 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-m4n82\""
Apr 24 21:27:05.589845 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.589813 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:27:05.590283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.590249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.590409 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.590327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wkwzl\""
Apr 24 21:27:05.590409 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.590399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:27:05.590705 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.590687 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:27:05.594275 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.594254 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:05.594545 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.594525 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:05.594772 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.594755 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v5lqg\""
Apr 24 21:27:05.615215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.615183 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:04 +0000 UTC" deadline="2028-01-27 10:36:03.72399085 +0000 UTC"
Apr 24 21:27:05.615215 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.615213 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15421h8m58.108781885s"
Apr 24 21:27:05.674110 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.674086 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:27:05.683303 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-system-cni-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.683423 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6849759-b993-4293-a216-c7f7861f1c3f-cni-binary-copy\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.683423 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.683423 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysconfig\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.683423 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-system-cni-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8cs\" (UniqueName: \"kubernetes.io/projected/c88b1b60-c919-439f-810d-ad2b2ecf4811-kube-api-access-5p8cs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvbb\" (UniqueName: \"kubernetes.io/projected/3ea34c37-0aa8-49fd-94af-4f419a1c9130-kube-api-access-kfvbb\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23aa9667-569b-4627-bc38-54b145825a25-hosts-file\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-kubelet\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-systemd\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-env-overrides\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.683672 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e9546e8-52dd-4b70-a206-29bd990eb383-ovn-node-metrics-cert\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-device-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5617082-ad48-4271-8c23-19c149807eba-serviceca\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-conf-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/94182b7e-3409-484c-82ea-df615ef6141e-konnectivity-ca\") pod \"konnectivity-agent-7mlvd\" (UID: \"94182b7e-3409-484c-82ea-df615ef6141e\") " pod="kube-system/konnectivity-agent-7mlvd"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-cni-bin\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.684049 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.683989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.684420 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.684420 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-conf-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.684420 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6849759-b993-4293-a216-c7f7861f1c3f-cni-binary-copy\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.684567 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/94182b7e-3409-484c-82ea-df615ef6141e-konnectivity-ca\") pod \"konnectivity-agent-7mlvd\" (UID: \"94182b7e-3409-484c-82ea-df615ef6141e\") " pod="kube-system/konnectivity-agent-7mlvd"
Apr 24 21:27:05.684628 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5617082-ad48-4271-8c23-19c149807eba-serviceca\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj"
Apr 24 21:27:05.684696 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-device-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.684746 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhznf\" (UniqueName: \"kubernetes.io/projected/d5617082-ad48-4271-8c23-19c149807eba-kube-api-access-vhznf\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj"
Apr 24 21:27:05.684786 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-cni-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.684835 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684819 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjz5\" (UniqueName: \"kubernetes.io/projected/f6849759-b993-4293-a216-c7f7861f1c3f-kube-api-access-kpjz5\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.684878 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-system-cni-dir\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.684931 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-cni-dir\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.684931 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1c909a7-04ae-48f7-903a-e294140cc67e-tmp\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.685057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-systemd-units\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.685057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.684966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-ovnkube-script-lib\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.685057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-system-cni-dir\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.685057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-socket-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.685230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-netns\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-socket-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.685230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6849759-b993-4293-a216-c7f7861f1c3f-multus-daemon-config\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.685230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9j9c\" (UniqueName: \"kubernetes.io/projected/23aa9667-569b-4627-bc38-54b145825a25-kube-api-access-k9j9c\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.685230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-netns\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-cnibin\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-k8s-cni-cncf-io\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-ovn\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-cni-netd\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-k8s-cni-cncf-io\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-cnibin\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhp8d\" (UniqueName: \"kubernetes.io/projected/8e9546e8-52dd-4b70-a206-29bd990eb383-kube-api-access-bhp8d\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-cnibin\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.685557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-cnibin\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cgl5\" (UniqueName: \"kubernetes.io/projected/44478f44-ad28-4f73-9fd4-429d584502ef-kube-api-access-6cgl5\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ft27\" (UniqueName: \"kubernetes.io/projected/d1c909a7-04ae-48f7-903a-e294140cc67e-kube-api-access-4ft27\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6849759-b993-4293-a216-c7f7861f1c3f-multus-daemon-config\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685675 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-log-socket\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5617082-ad48-4271-8c23-19c149807eba-host\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c85c2f4a-3310-4862-b16b-7dd95f352625-iptables-alerter-script\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685786 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysctl-conf\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-var-lib-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-node-log\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5617082-ad48-4271-8c23-19c149807eba-host\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-etc-selinux\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685917 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-systemd\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-multus-certs\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686126 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-lib-modules\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.685997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-registration-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-etc-selinux\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-run-multus-certs\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-sys-fs\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-socket-dir-parent\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-registration-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-kubelet\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-sys-fs\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-multus-socket-dir-parent\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-modprobe-d\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysctl-d\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-kubelet\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName:
\"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-ovnkube-config\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23aa9667-569b-4627-bc38-54b145825a25-tmp-dir\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7" Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-kubernetes\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.686904 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-cni-bin\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-os-release\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-host\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-run-netns\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-cni-bin\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-cni-multus\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686446 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44478f44-ad28-4f73-9fd4-429d584502ef-os-release\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-etc-kubernetes\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-host-var-lib-cni-multus\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-var-lib-kubelet\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-tuned\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686522 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-etc-kubernetes\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-hostroot\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686605 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/94182b7e-3409-484c-82ea-df615ef6141e-agent-certs\") pod \"konnectivity-agent-7mlvd\" (UID: \"94182b7e-3409-484c-82ea-df615ef6141e\") " pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44478f44-ad28-4f73-9fd4-429d584502ef-cni-binary-copy\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686613 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-hostroot\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.687541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-run\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-slash\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-os-release\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686825 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c85c2f4a-3310-4862-b16b-7dd95f352625-host-slash\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ea34c37-0aa8-49fd-94af-4f419a1c9130-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxtmz\" (UniqueName: \"kubernetes.io/projected/c85c2f4a-3310-4862-b16b-7dd95f352625-kube-api-access-lxtmz\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6849759-b993-4293-a216-c7f7861f1c3f-os-release\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.686931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-sys\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: 
I0424 21:27:05.686949 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-etc-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.688195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.687022 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:27:05.690426 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.690407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/94182b7e-3409-484c-82ea-df615ef6141e-agent-certs\") pod \"konnectivity-agent-7mlvd\" (UID: \"94182b7e-3409-484c-82ea-df615ef6141e\") " pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:05.692305 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.692284 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:05.692423 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.692312 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:05.692423 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.692326 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s7h56 for pod openshift-network-diagnostics/network-check-target-g2stj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:05.692423 ip-10-0-136-65 kubenswrapper[2577]: 
E0424 21:27:05.692411 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56 podName:0c5854d5-4980-4604-9aa1-a757c380c0da nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.192379187 +0000 UTC m=+3.075365385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s7h56" (UniqueName: "kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56") pod "network-check-target-g2stj" (UID: "0c5854d5-4980-4604-9aa1-a757c380c0da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:05.693112 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.693053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvbb\" (UniqueName: \"kubernetes.io/projected/3ea34c37-0aa8-49fd-94af-4f419a1c9130-kube-api-access-kfvbb\") pod \"aws-ebs-csi-driver-node-4tjld\" (UID: \"3ea34c37-0aa8-49fd-94af-4f419a1c9130\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" Apr 24 21:27:05.693469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.693431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhznf\" (UniqueName: \"kubernetes.io/projected/d5617082-ad48-4271-8c23-19c149807eba-kube-api-access-vhznf\") pod \"node-ca-672zj\" (UID: \"d5617082-ad48-4271-8c23-19c149807eba\") " pod="openshift-image-registry/node-ca-672zj" Apr 24 21:27:05.694131 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.694111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjz5\" (UniqueName: \"kubernetes.io/projected/f6849759-b993-4293-a216-c7f7861f1c3f-kube-api-access-kpjz5\") pod \"multus-wl7mb\" (UID: \"f6849759-b993-4293-a216-c7f7861f1c3f\") " pod="openshift-multus/multus-wl7mb" Apr 24 21:27:05.694980 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.694958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cgl5\" (UniqueName: \"kubernetes.io/projected/44478f44-ad28-4f73-9fd4-429d584502ef-kube-api-access-6cgl5\") pod \"multus-additional-cni-plugins-pvrrx\" (UID: \"44478f44-ad28-4f73-9fd4-429d584502ef\") " pod="openshift-multus/multus-additional-cni-plugins-pvrrx" Apr 24 21:27:05.704642 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.704596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" event={"ID":"7f1b6e2046d6b808fbf08cbfde4e508e","Type":"ContainerStarted","Data":"7c3f80f70a8701d2955cfef991a122e198a15dda9b6074d3a30de744a885513a"} Apr 24 21:27:05.705617 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.705591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" event={"ID":"9fc64b0f94b5e05651be58c19d8e03a7","Type":"ContainerStarted","Data":"a6e405ad6d4a5fb2057ea28b35327035c3b974c04f25408d7ab902ba86ef9344"} Apr 24 21:27:05.787230 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-host\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-run-netns\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787262 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-var-lib-kubelet\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-tuned\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-host\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787320 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-run-netns\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787348 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-var-lib-kubelet\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-run\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-slash\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.787403 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.787401 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-slash\") pod \"ovnkube-node-lvpcz\" 
(UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-run\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787458 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:05.787462 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.287442612 +0000 UTC m=+3.170428794 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c85c2f4a-3310-4862-b16b-7dd95f352625-host-slash\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxtmz\" (UniqueName: \"kubernetes.io/projected/c85c2f4a-3310-4862-b16b-7dd95f352625-kube-api-access-lxtmz\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-sys\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787559 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c85c2f4a-3310-4862-b16b-7dd95f352625-host-slash\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj" Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787616 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-sys\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787684 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-etc-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysconfig\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8cs\" (UniqueName: \"kubernetes.io/projected/c88b1b60-c919-439f-810d-ad2b2ecf4811-kube-api-access-5p8cs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-etc-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23aa9667-569b-4627-bc38-54b145825a25-hosts-file\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.787956 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysconfig\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-kubelet\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-systemd\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-kubelet\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-env-overrides\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e9546e8-52dd-4b70-a206-29bd990eb383-ovn-node-metrics-cert\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-systemd\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.787917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/23aa9667-569b-4627-bc38-54b145825a25-hosts-file\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-cni-bin\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1c909a7-04ae-48f7-903a-e294140cc67e-tmp\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-systemd-units\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-ovnkube-script-lib\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9j9c\" (UniqueName: \"kubernetes.io/projected/23aa9667-569b-4627-bc38-54b145825a25-kube-api-access-k9j9c\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-ovn\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-systemd-units\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788146 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-cni-netd\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.788726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhp8d\" (UniqueName: \"kubernetes.io/projected/8e9546e8-52dd-4b70-a206-29bd990eb383-kube-api-access-bhp8d\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ft27\" (UniqueName: \"kubernetes.io/projected/d1c909a7-04ae-48f7-903a-e294140cc67e-kube-api-access-4ft27\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-log-socket\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c85c2f4a-3310-4862-b16b-7dd95f352625-iptables-alerter-script\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysctl-conf\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-env-overrides\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-var-lib-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-node-log\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-log-socket\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-node-log\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-systemd\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-lib-modules\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-modprobe-d\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysctl-d\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-ovnkube-config\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-cni-bin\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.789469 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23aa9667-569b-4627-bc38-54b145825a25-tmp-dir\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-kubernetes\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-var-lib-openvswitch\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-systemd\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysctl-conf\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-ovnkube-script-lib\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-cni-netd\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c85c2f4a-3310-4862-b16b-7dd95f352625-iptables-alerter-script\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788959 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-run-ovn\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.788961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/23aa9667-569b-4627-bc38-54b145825a25-tmp-dir\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-sysctl-d\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-modprobe-d\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e9546e8-52dd-4b70-a206-29bd990eb383-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-kubernetes\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e9546e8-52dd-4b70-a206-29bd990eb383-ovnkube-config\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1c909a7-04ae-48f7-903a-e294140cc67e-lib-modules\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.789808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d1c909a7-04ae-48f7-903a-e294140cc67e-etc-tuned\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790833 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.790381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d1c909a7-04ae-48f7-903a-e294140cc67e-tmp\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.790918 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.790896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e9546e8-52dd-4b70-a206-29bd990eb383-ovn-node-metrics-cert\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.796887 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.796861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxtmz\" (UniqueName: \"kubernetes.io/projected/c85c2f4a-3310-4862-b16b-7dd95f352625-kube-api-access-lxtmz\") pod \"iptables-alerter-84qsj\" (UID: \"c85c2f4a-3310-4862-b16b-7dd95f352625\") " pod="openshift-network-operator/iptables-alerter-84qsj"
Apr 24 21:27:05.797787 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.797727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9j9c\" (UniqueName: \"kubernetes.io/projected/23aa9667-569b-4627-bc38-54b145825a25-kube-api-access-k9j9c\") pod \"node-resolver-s74k7\" (UID: \"23aa9667-569b-4627-bc38-54b145825a25\") " pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.797787 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.797735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8cs\" (UniqueName: \"kubernetes.io/projected/c88b1b60-c919-439f-810d-ad2b2ecf4811-kube-api-access-5p8cs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:05.798211 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.798186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhp8d\" (UniqueName: \"kubernetes.io/projected/8e9546e8-52dd-4b70-a206-29bd990eb383-kube-api-access-bhp8d\") pod \"ovnkube-node-lvpcz\" (UID: \"8e9546e8-52dd-4b70-a206-29bd990eb383\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.798500 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.798481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ft27\" (UniqueName: \"kubernetes.io/projected/d1c909a7-04ae-48f7-903a-e294140cc67e-kube-api-access-4ft27\") pod \"tuned-dc7wx\" (UID: \"d1c909a7-04ae-48f7-903a-e294140cc67e\") " pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:05.876590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.876513 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wl7mb"
Apr 24 21:27:05.882319 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.882288 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pvrrx"
Apr 24 21:27:05.893073 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.893052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7mlvd"
Apr 24 21:27:05.898652 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.898629 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld"
Apr 24 21:27:05.905207 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.905190 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-672zj"
Apr 24 21:27:05.912332 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.912312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-84qsj"
Apr 24 21:27:05.918952 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.918934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz"
Apr 24 21:27:05.925499 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.925482 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s74k7"
Apr 24 21:27:05.930056 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:05.930040 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dc7wx"
Apr 24 21:27:06.026165 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.026130 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:06.192545 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.192459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:06.192716 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:06.192623 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:06.192716 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:06.192644 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:06.192716 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:06.192656 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s7h56 for pod openshift-network-diagnostics/network-check-target-g2stj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:06.192867 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:06.192721 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56 podName:0c5854d5-4980-4604-9aa1-a757c380c0da nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.192702403 +0000 UTC m=+4.075688587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7h56" (UniqueName: "kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56") pod "network-check-target-g2stj" (UID: "0c5854d5-4980-4604-9aa1-a757c380c0da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:06.293470 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.293435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:06.293636 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:06.293601 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:06.293706 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:06.293677 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.293654502 +0000 UTC m=+4.176640685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:06.363545 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.363516 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5617082_ad48_4271_8c23_19c149807eba.slice/crio-cac6c92cf9dccc825e537766805f1a724563a1998eaacb631a7fb2952fd9db14 WatchSource:0}: Error finding container cac6c92cf9dccc825e537766805f1a724563a1998eaacb631a7fb2952fd9db14: Status 404 returned error can't find the container with id cac6c92cf9dccc825e537766805f1a724563a1998eaacb631a7fb2952fd9db14
Apr 24 21:27:06.364505 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.364483 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44478f44_ad28_4f73_9fd4_429d584502ef.slice/crio-30398492d8ba44fd3a58b6ce6b0cc137d3c24b15298733a1a8515aa2e5869436 WatchSource:0}: Error finding container 30398492d8ba44fd3a58b6ce6b0cc137d3c24b15298733a1a8515aa2e5869436: Status 404 returned error can't find the container with id 30398492d8ba44fd3a58b6ce6b0cc137d3c24b15298733a1a8515aa2e5869436
Apr 24 21:27:06.365629 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.365565 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94182b7e_3409_484c_82ea_df615ef6141e.slice/crio-c647663116c31baa0ef56aa5b665fe3d85f35403f2960b4f49114345c26897c6 WatchSource:0}: Error finding container c647663116c31baa0ef56aa5b665fe3d85f35403f2960b4f49114345c26897c6: Status 404 returned error can't find the container with id c647663116c31baa0ef56aa5b665fe3d85f35403f2960b4f49114345c26897c6
Apr 24 21:27:06.366463 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.366441 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e9546e8_52dd_4b70_a206_29bd990eb383.slice/crio-74bfe7cec7e62a7c120f07a9eaeb08bd465a1bd2a07097d0e65891b3dc4b42a1 WatchSource:0}: Error finding container 74bfe7cec7e62a7c120f07a9eaeb08bd465a1bd2a07097d0e65891b3dc4b42a1: Status 404 returned error can't find the container with id 74bfe7cec7e62a7c120f07a9eaeb08bd465a1bd2a07097d0e65891b3dc4b42a1
Apr 24 21:27:06.369559 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.369536 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85c2f4a_3310_4862_b16b_7dd95f352625.slice/crio-c3f260ef90c5dee6bc1efdc41d8017169aa6940d7e84d1b79dc244614efd8ed6 WatchSource:0}: Error finding container c3f260ef90c5dee6bc1efdc41d8017169aa6940d7e84d1b79dc244614efd8ed6: Status 404 returned error can't find the container with id c3f260ef90c5dee6bc1efdc41d8017169aa6940d7e84d1b79dc244614efd8ed6
Apr 24 21:27:06.390140 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.390119 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23aa9667_569b_4627_bc38_54b145825a25.slice/crio-57b80d7daf8c35e3a0dbbb13ba891174ce5f135efa8a7e80f6f49fafff804b7f WatchSource:0}: Error finding container 57b80d7daf8c35e3a0dbbb13ba891174ce5f135efa8a7e80f6f49fafff804b7f: Status 404 returned error can't find the container with id 57b80d7daf8c35e3a0dbbb13ba891174ce5f135efa8a7e80f6f49fafff804b7f
Apr 24 21:27:06.390838 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.390719 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6849759_b993_4293_a216_c7f7861f1c3f.slice/crio-ca1ac0268ffbf7656500d54518b6df63c17d795922fed71460b13d82341e4be1 WatchSource:0}: Error finding container ca1ac0268ffbf7656500d54518b6df63c17d795922fed71460b13d82341e4be1: Status 404 returned error can't find the container with id ca1ac0268ffbf7656500d54518b6df63c17d795922fed71460b13d82341e4be1
Apr 24 21:27:06.391439 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.391422 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c909a7_04ae_48f7_903a_e294140cc67e.slice/crio-630a805825fd930de462702d5efeb81a4f0230d135682b853bb6f4c2931b6c5b WatchSource:0}: Error finding container 630a805825fd930de462702d5efeb81a4f0230d135682b853bb6f4c2931b6c5b: Status 404 returned error can't find the container with id 630a805825fd930de462702d5efeb81a4f0230d135682b853bb6f4c2931b6c5b
Apr 24 21:27:06.393209 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:06.393193 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea34c37_0aa8_49fd_94af_4f419a1c9130.slice/crio-5f3c8b9fbeaba71ae1509b347e4d1e422e3c07ecd9c47ff9c338583a00993396 WatchSource:0}: Error finding container 5f3c8b9fbeaba71ae1509b347e4d1e422e3c07ecd9c47ff9c338583a00993396: Status 404 returned error can't find the container with id 5f3c8b9fbeaba71ae1509b347e4d1e422e3c07ecd9c47ff9c338583a00993396
Apr 24 21:27:06.615900 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.615724 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:04 +0000 UTC" deadline="2028-01-04 11:36:34.81933363 +0000 UTC"
Apr 24 21:27:06.615900 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.615894 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14870h9m28.20344223s"
Apr 24 21:27:06.710479 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.709711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"74bfe7cec7e62a7c120f07a9eaeb08bd465a1bd2a07097d0e65891b3dc4b42a1"}
Apr 24 21:27:06.715561 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.715472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerStarted","Data":"30398492d8ba44fd3a58b6ce6b0cc137d3c24b15298733a1a8515aa2e5869436"}
Apr 24 21:27:06.724303 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.724247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-672zj" event={"ID":"d5617082-ad48-4271-8c23-19c149807eba","Type":"ContainerStarted","Data":"cac6c92cf9dccc825e537766805f1a724563a1998eaacb631a7fb2952fd9db14"}
Apr 24 21:27:06.729625 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.729596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" event={"ID":"3ea34c37-0aa8-49fd-94af-4f419a1c9130","Type":"ContainerStarted","Data":"5f3c8b9fbeaba71ae1509b347e4d1e422e3c07ecd9c47ff9c338583a00993396"}
Apr 24 21:27:06.731653 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.731625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wl7mb" event={"ID":"f6849759-b993-4293-a216-c7f7861f1c3f","Type":"ContainerStarted","Data":"ca1ac0268ffbf7656500d54518b6df63c17d795922fed71460b13d82341e4be1"}
Apr 24 21:27:06.733630 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.733608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s74k7" event={"ID":"23aa9667-569b-4627-bc38-54b145825a25","Type":"ContainerStarted","Data":"57b80d7daf8c35e3a0dbbb13ba891174ce5f135efa8a7e80f6f49fafff804b7f"}
Apr 24 21:27:06.735330 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.735298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84qsj" event={"ID":"c85c2f4a-3310-4862-b16b-7dd95f352625","Type":"ContainerStarted","Data":"c3f260ef90c5dee6bc1efdc41d8017169aa6940d7e84d1b79dc244614efd8ed6"}
Apr 24 21:27:06.738107 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.738083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7mlvd" event={"ID":"94182b7e-3409-484c-82ea-df615ef6141e","Type":"ContainerStarted","Data":"c647663116c31baa0ef56aa5b665fe3d85f35403f2960b4f49114345c26897c6"}
Apr 24 21:27:06.742438 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.742417 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" event={"ID":"9fc64b0f94b5e05651be58c19d8e03a7","Type":"ContainerStarted","Data":"f4c6608ee646229bf3e4742cadd557c28dc7c7b0b72e67c824bf910b1fcaca51"}
Apr 24 21:27:06.745377 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.745354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" event={"ID":"d1c909a7-04ae-48f7-903a-e294140cc67e","Type":"ContainerStarted","Data":"630a805825fd930de462702d5efeb81a4f0230d135682b853bb6f4c2931b6c5b"}
Apr 24 21:27:06.838748 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:06.838413 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:07.200698 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.199986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:07.200698 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.200166 2577
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:07.200698 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.200183 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:07.200698 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.200196 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s7h56 for pod openshift-network-diagnostics/network-check-target-g2stj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:07.200698 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.200250 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56 podName:0c5854d5-4980-4604-9aa1-a757c380c0da nodeName:}" failed. No retries permitted until 2026-04-24 21:27:09.200232228 +0000 UTC m=+6.083218413 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s7h56" (UniqueName: "kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56") pod "network-check-target-g2stj" (UID: "0c5854d5-4980-4604-9aa1-a757c380c0da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:07.301377 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.300863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:07.301377 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.300980 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:07.301377 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.301040 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:09.301023479 +0000 UTC m=+6.184009664 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:07.700402 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.700323 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:07.700868 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.700437 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:07.700868 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.700530 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:07.700868 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:07.700630 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:07.754421 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.753264 2577 generic.go:358] "Generic (PLEG): container finished" podID="7f1b6e2046d6b808fbf08cbfde4e508e" containerID="051ee3defea7a787dec8f5cea1286db4f7523badc5ecee5a8b453f968fdf5f92" exitCode=0 Apr 24 21:27:07.754421 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.754213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" event={"ID":"7f1b6e2046d6b808fbf08cbfde4e508e","Type":"ContainerDied","Data":"051ee3defea7a787dec8f5cea1286db4f7523badc5ecee5a8b453f968fdf5f92"} Apr 24 21:27:07.770899 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:07.770010 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-65.ec2.internal" podStartSLOduration=2.7699921180000002 podStartE2EDuration="2.769992118s" podCreationTimestamp="2026-04-24 21:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:06.762490872 +0000 UTC m=+3.645477075" watchObservedRunningTime="2026-04-24 21:27:07.769992118 +0000 UTC m=+4.652978322" Apr 24 21:27:08.760483 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:08.760446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" event={"ID":"7f1b6e2046d6b808fbf08cbfde4e508e","Type":"ContainerStarted","Data":"2811f5b8a157420f77ff975911a80920b3611f399a9fba2cb3d7b651ccfd59a9"} Apr 24 21:27:09.219799 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:09.219174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: 
\"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:09.219799 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.219331 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:09.219799 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.219352 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:09.219799 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.219365 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s7h56 for pod openshift-network-diagnostics/network-check-target-g2stj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:09.219799 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.219425 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56 podName:0c5854d5-4980-4604-9aa1-a757c380c0da nodeName:}" failed. No retries permitted until 2026-04-24 21:27:13.219407447 +0000 UTC m=+10.102393632 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s7h56" (UniqueName: "kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56") pod "network-check-target-g2stj" (UID: "0c5854d5-4980-4604-9aa1-a757c380c0da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:09.320305 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:09.320270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:09.320490 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.320438 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:09.320552 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.320503 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:13.320483606 +0000 UTC m=+10.203469786 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:09.698464 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:09.698096 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:09.698464 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.698224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:09.698730 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:09.698100 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:09.698730 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:09.698701 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:11.697728 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:11.697702 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:11.698168 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:11.697733 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:11.698168 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:11.697806 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:11.698168 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:11.697921 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:13.252283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:13.252248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:13.252778 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.252439 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:13.252778 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.252460 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 
21:27:13.252778 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.252468 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s7h56 for pod openshift-network-diagnostics/network-check-target-g2stj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:13.252778 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.252528 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56 podName:0c5854d5-4980-4604-9aa1-a757c380c0da nodeName:}" failed. No retries permitted until 2026-04-24 21:27:21.252508458 +0000 UTC m=+18.135494640 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7h56" (UniqueName: "kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56") pod "network-check-target-g2stj" (UID: "0c5854d5-4980-4604-9aa1-a757c380c0da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:13.353194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:13.353137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:13.353380 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.353256 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:13.353380 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.353319 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:21.353299214 +0000 UTC m=+18.236285394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:13.698632 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:13.698593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:13.698632 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:13.698634 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:13.698847 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.698723 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:13.699160 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:13.699132 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:15.697637 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:15.697605 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:15.697637 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:15.697628 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:15.698135 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:15.697721 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:15.698135 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:15.697883 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:17.697983 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:17.697946 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:17.698430 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:17.698083 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:17.698430 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:17.698140 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:17.698430 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:17.698250 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:19.698030 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:19.697986 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:19.698470 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:19.697995 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:19.698470 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:19.698091 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:19.698470 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:19.698228 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:21.308543 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:21.308509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:21.309074 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.308671 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:21.309074 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.308692 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 
21:27:21.309074 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.308707 2577 projected.go:194] Error preparing data for projected volume kube-api-access-s7h56 for pod openshift-network-diagnostics/network-check-target-g2stj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:21.309074 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.308772 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56 podName:0c5854d5-4980-4604-9aa1-a757c380c0da nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.308754384 +0000 UTC m=+34.191740564 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s7h56" (UniqueName: "kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56") pod "network-check-target-g2stj" (UID: "0c5854d5-4980-4604-9aa1-a757c380c0da") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:21.409158 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:21.409120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:21.409331 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.409261 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:21.409331 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.409320 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.409305989 +0000 UTC m=+34.292292168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:21.698302 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:21.698258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:21.698478 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.698380 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:21.698561 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:21.698477 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:21.698629 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:21.698609 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:23.698472 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.698179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:23.698472 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:23.698385 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:23.698472 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.698225 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:23.698472 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:23.698456 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:23.787059 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.786913 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" event={"ID":"3ea34c37-0aa8-49fd-94af-4f419a1c9130","Type":"ContainerStarted","Data":"c62908e4663d5ab020a242915b10c4dd37439d38990dbb244455149d54729812"} Apr 24 21:27:23.788238 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.788206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wl7mb" event={"ID":"f6849759-b993-4293-a216-c7f7861f1c3f","Type":"ContainerStarted","Data":"c5a2a5b3abfbd8af91d28e5726f47845c2d5e61ccec24ff9a29a798b3e376b9d"} Apr 24 21:27:23.789452 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.789429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s74k7" event={"ID":"23aa9667-569b-4627-bc38-54b145825a25","Type":"ContainerStarted","Data":"62decff46a29e17f87810e62a0a61e5e5211872d4d5f7fc498983a9b73814b0f"} Apr 24 21:27:23.790624 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.790599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7mlvd" event={"ID":"94182b7e-3409-484c-82ea-df615ef6141e","Type":"ContainerStarted","Data":"ad0bc405f592dfdacc3a59b8280fab467b00b76e36b42fe57585ee637028bb6e"} Apr 24 21:27:23.791731 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.791712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" event={"ID":"d1c909a7-04ae-48f7-903a-e294140cc67e","Type":"ContainerStarted","Data":"29729505c13289ef2f2405729629d392e1c28833f5cc7429799142fd0156806b"} Apr 24 21:27:23.793190 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.793172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" 
event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"628dab0dded80ab8d3ae2e21b0b673e6eb862550bc1796d7aed8539bf9f8a338"} Apr 24 21:27:23.793254 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.793196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"194531d4947bcf39ec854165b03b47a2c8884aad18dc45ce045ce67c969979e0"} Apr 24 21:27:23.794345 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.794326 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerStarted","Data":"771a1d76828e4dc3a61f317fd558aafd1998f2ae8209f60a309208d6146f5586"} Apr 24 21:27:23.795436 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.795418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-672zj" event={"ID":"d5617082-ad48-4271-8c23-19c149807eba","Type":"ContainerStarted","Data":"87bc193cc958d98db3e92495c3a43e8d614e06966e09f0c7ad8c59c5c6a5f10f"} Apr 24 21:27:23.807927 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.807893 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-65.ec2.internal" podStartSLOduration=18.807882641 podStartE2EDuration="18.807882641s" podCreationTimestamp="2026-04-24 21:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:08.776403115 +0000 UTC m=+5.659389322" watchObservedRunningTime="2026-04-24 21:27:23.807882641 +0000 UTC m=+20.690868852" Apr 24 21:27:23.808286 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.808266 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wl7mb" 
podStartSLOduration=3.5306421119999998 podStartE2EDuration="20.808261717s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.395230063 +0000 UTC m=+3.278216250" lastFinishedPulling="2026-04-24 21:27:23.672849675 +0000 UTC m=+20.555835855" observedRunningTime="2026-04-24 21:27:23.807699867 +0000 UTC m=+20.690686087" watchObservedRunningTime="2026-04-24 21:27:23.808261717 +0000 UTC m=+20.691247918" Apr 24 21:27:23.834341 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.834295 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7mlvd" podStartSLOduration=8.48222393 podStartE2EDuration="20.83427865s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.36812322 +0000 UTC m=+3.251109401" lastFinishedPulling="2026-04-24 21:27:18.720177938 +0000 UTC m=+15.603164121" observedRunningTime="2026-04-24 21:27:23.833770028 +0000 UTC m=+20.716756232" watchObservedRunningTime="2026-04-24 21:27:23.83427865 +0000 UTC m=+20.717264865" Apr 24 21:27:23.879862 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.879810 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s74k7" podStartSLOduration=2.952657368 podStartE2EDuration="19.879793225s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.391883437 +0000 UTC m=+3.274869617" lastFinishedPulling="2026-04-24 21:27:23.31901928 +0000 UTC m=+20.202005474" observedRunningTime="2026-04-24 21:27:23.879186025 +0000 UTC m=+20.762172238" watchObservedRunningTime="2026-04-24 21:27:23.879793225 +0000 UTC m=+20.762779428" Apr 24 21:27:23.880157 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.880128 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-672zj" podStartSLOduration=3.92614041 podStartE2EDuration="20.880121919s" 
podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.365140769 +0000 UTC m=+3.248126953" lastFinishedPulling="2026-04-24 21:27:23.319122275 +0000 UTC m=+20.202108462" observedRunningTime="2026-04-24 21:27:23.862491437 +0000 UTC m=+20.745477640" watchObservedRunningTime="2026-04-24 21:27:23.880121919 +0000 UTC m=+20.763108121" Apr 24 21:27:23.899069 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.899028 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dc7wx" podStartSLOduration=2.9736326809999998 podStartE2EDuration="19.899013877s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.394986587 +0000 UTC m=+3.277972769" lastFinishedPulling="2026-04-24 21:27:23.320367782 +0000 UTC m=+20.203353965" observedRunningTime="2026-04-24 21:27:23.898732438 +0000 UTC m=+20.781718639" watchObservedRunningTime="2026-04-24 21:27:23.899013877 +0000 UTC m=+20.782000078" Apr 24 21:27:23.950327 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.950302 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-49ln2"] Apr 24 21:27:23.954187 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:23.954166 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:23.954279 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:23.954259 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8" Apr 24 21:27:24.030200 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.030171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e4eb2c6b-a416-422e-a95a-e7759eca39e8-dbus\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.030294 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.030243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e4eb2c6b-a416-422e-a95a-e7759eca39e8-kubelet-config\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.030417 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.030394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.131293 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.131206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e4eb2c6b-a416-422e-a95a-e7759eca39e8-kubelet-config\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.131293 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.131255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.131519 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.131329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e4eb2c6b-a416-422e-a95a-e7759eca39e8-dbus\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.131519 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.131350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e4eb2c6b-a416-422e-a95a-e7759eca39e8-kubelet-config\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.131519 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:24.131426 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:24.131519 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:24.131494 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret podName:e4eb2c6b-a416-422e-a95a-e7759eca39e8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:24.6314769 +0000 UTC m=+21.514463086 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret") pod "global-pull-secret-syncer-49ln2" (UID: "e4eb2c6b-a416-422e-a95a-e7759eca39e8") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:24.131519 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.131506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e4eb2c6b-a416-422e-a95a-e7759eca39e8-dbus\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.486689 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.486666 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:27:24.633988 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.633962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:24.634130 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:24.634067 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:24.634130 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:24.634114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret podName:e4eb2c6b-a416-422e-a95a-e7759eca39e8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:25.634100727 +0000 UTC m=+22.517086907 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret") pod "global-pull-secret-syncer-49ln2" (UID: "e4eb2c6b-a416-422e-a95a-e7759eca39e8") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:24.646825 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.646753 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:24.486684766Z","UUID":"af0ec0fd-fae6-41ce-ac1b-4a07aee3decb","Handler":null,"Name":"","Endpoint":""} Apr 24 21:27:24.649451 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.649431 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:27:24.649451 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.649456 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:24.797985 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.797951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-84qsj" event={"ID":"c85c2f4a-3310-4862-b16b-7dd95f352625","Type":"ContainerStarted","Data":"9519d9603d1700880c395a5324b0fd1df46f5e719eac24abc84972d2265370d8"} Apr 24 21:27:24.800384 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.800360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"c4e2a41ae46334d42707ebd65cce8c8ec9b61d86df380f991c2b909a2a8f9a20"} Apr 24 21:27:24.800499 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.800391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"0e859a32e8ddd6ce5269d4372a659848a4f7e8e51518ca2449359be46e70c31e"} Apr 24 21:27:24.800499 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.800404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"600561f8a846864bc45e0bece3694624c3f818a74a3b434b96573ce15f6a4722"} Apr 24 21:27:24.800499 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.800416 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"b9565dbd83ab5cd647f705fb113d377c660a10caf50bfb0e89129e7081c1c162"} Apr 24 21:27:24.801518 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.801498 2577 generic.go:358] "Generic (PLEG): container finished" podID="44478f44-ad28-4f73-9fd4-429d584502ef" containerID="771a1d76828e4dc3a61f317fd558aafd1998f2ae8209f60a309208d6146f5586" exitCode=0 Apr 24 21:27:24.801610 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.801587 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerDied","Data":"771a1d76828e4dc3a61f317fd558aafd1998f2ae8209f60a309208d6146f5586"} Apr 24 21:27:24.803258 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.803236 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" event={"ID":"3ea34c37-0aa8-49fd-94af-4f419a1c9130","Type":"ContainerStarted","Data":"0853268899d4723e547782dd0dfdab1f3d9d0e3e9b5c1a2a348ff3c3fb352055"} Apr 24 21:27:24.813057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:24.813021 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-network-operator/iptables-alerter-84qsj" podStartSLOduration=9.481855301 podStartE2EDuration="21.813009494s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.389024941 +0000 UTC m=+3.272011121" lastFinishedPulling="2026-04-24 21:27:18.720179134 +0000 UTC m=+15.603165314" observedRunningTime="2026-04-24 21:27:24.812510283 +0000 UTC m=+21.695496489" watchObservedRunningTime="2026-04-24 21:27:24.813009494 +0000 UTC m=+21.695995733" Apr 24 21:27:25.360395 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.360176 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:25.642562 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.642529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:25.642766 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:25.642646 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:25.642766 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:25.642701 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret podName:e4eb2c6b-a416-422e-a95a-e7759eca39e8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.64268327 +0000 UTC m=+24.525669453 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret") pod "global-pull-secret-syncer-49ln2" (UID: "e4eb2c6b-a416-422e-a95a-e7759eca39e8") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:25.698380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.698345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:25.698558 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.698345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:25.698558 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:25.698455 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:25.698558 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.698351 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:25.698558 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:25.698506 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8" Apr 24 21:27:25.698780 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:25.698629 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:25.808342 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.808295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" event={"ID":"3ea34c37-0aa8-49fd-94af-4f419a1c9130","Type":"ContainerStarted","Data":"5ced3c9454a8d15257557b0b28991a8ef47cc4a85236e19b04bbe21553b873c4"} Apr 24 21:27:25.833423 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:25.833364 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4tjld" podStartSLOduration=4.041216633 podStartE2EDuration="22.833344813s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.395353384 +0000 UTC m=+3.278339564" lastFinishedPulling="2026-04-24 21:27:25.187481555 +0000 UTC m=+22.070467744" observedRunningTime="2026-04-24 21:27:25.832836445 +0000 UTC m=+22.715822647" watchObservedRunningTime="2026-04-24 21:27:25.833344813 +0000 UTC m=+22.716331014" Apr 24 21:27:26.813914 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:26.813879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"54425aef6ca9c6798b2765ed2c64ec843ac2dc5323b6d3037591301309b73445"} Apr 24 21:27:26.926477 ip-10-0-136-65 kubenswrapper[2577]: I0424 
21:27:26.926439 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:26.927283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:26.927259 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:27.658233 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:27.658183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:27.658429 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:27.658325 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:27.658429 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:27.658403 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret podName:e4eb2c6b-a416-422e-a95a-e7759eca39e8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.658382104 +0000 UTC m=+28.541368299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret") pod "global-pull-secret-syncer-49ln2" (UID: "e4eb2c6b-a416-422e-a95a-e7759eca39e8") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:27.698192 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:27.698146 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:27.698192 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:27.698172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:27.698422 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:27.698258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:27.698422 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:27.698269 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8" Apr 24 21:27:27.698422 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:27.698347 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:27.698566 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:27.698465 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:27.816974 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:27.816941 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7mlvd" Apr 24 21:27:28.822018 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:28.821476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" event={"ID":"8e9546e8-52dd-4b70-a206-29bd990eb383","Type":"ContainerStarted","Data":"a5c910f932b28123db7b0869e69cb91466c4bcdb8f7696184e23b7de8d6a6d4d"} Apr 24 21:27:28.851653 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:28.851523 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" podStartSLOduration=7.636207619 podStartE2EDuration="24.851499364s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.368525552 +0000 UTC m=+3.251511732" lastFinishedPulling="2026-04-24 21:27:23.583817292 +0000 UTC m=+20.466803477" observedRunningTime="2026-04-24 21:27:28.851051584 +0000 UTC m=+25.734037796" watchObservedRunningTime="2026-04-24 21:27:28.851499364 +0000 UTC m=+25.734485565" Apr 24 21:27:29.698130 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.698105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:29.698130 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.698118 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:29.698330 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.698118 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:29.698330 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:29.698191 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8" Apr 24 21:27:29.698330 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:29.698304 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811" Apr 24 21:27:29.698433 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:29.698387 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da" Apr 24 21:27:29.825013 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.824975 2577 generic.go:358] "Generic (PLEG): container finished" podID="44478f44-ad28-4f73-9fd4-429d584502ef" containerID="716ecdc0afbfcdb1f2fac500c3fbb392392500fa6a846aa427d588cc5dc2f8de" exitCode=0 Apr 24 21:27:29.825393 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.825052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerDied","Data":"716ecdc0afbfcdb1f2fac500c3fbb392392500fa6a846aa427d588cc5dc2f8de"} Apr 24 21:27:29.825786 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.825614 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:29.825786 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.825642 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:29.825786 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.825655 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:29.840019 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.839999 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:29.840133 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:29.840083 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:27:30.722309 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.722021 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-49ln2"] Apr 24 21:27:30.722469 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:27:30.722331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2"
Apr 24 21:27:30.722469 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:30.722421 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8"
Apr 24 21:27:30.725614 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.725590 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g2stj"]
Apr 24 21:27:30.725725 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.725678 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:30.725788 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:30.725769 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da"
Apr 24 21:27:30.726169 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.726148 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r5pbt"]
Apr 24 21:27:30.726261 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.726245 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:30.726384 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:30.726361 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811"
Apr 24 21:27:30.828199 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.828166 2577 generic.go:358] "Generic (PLEG): container finished" podID="44478f44-ad28-4f73-9fd4-429d584502ef" containerID="54e6d9ec7584f41431ef841b6a05d43c68fd5da14fefa3dfee99b2fbb3e52d5a" exitCode=0
Apr 24 21:27:30.828561 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:30.828245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerDied","Data":"54e6d9ec7584f41431ef841b6a05d43c68fd5da14fefa3dfee99b2fbb3e52d5a"}
Apr 24 21:27:31.690240 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:31.690148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2"
Apr 24 21:27:31.690395 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:31.690315 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:31.690457 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:31.690397 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret podName:e4eb2c6b-a416-422e-a95a-e7759eca39e8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:39.690374765 +0000 UTC m=+36.573360949 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret") pod "global-pull-secret-syncer-49ln2" (UID: "e4eb2c6b-a416-422e-a95a-e7759eca39e8") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:31.832143 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:31.832107 2577 generic.go:358] "Generic (PLEG): container finished" podID="44478f44-ad28-4f73-9fd4-429d584502ef" containerID="6eb29e6fb2e11217afed4620ab6066ecd9ea4eeff198d42a1b9d72ef425123af" exitCode=0
Apr 24 21:27:31.832609 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:31.832188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerDied","Data":"6eb29e6fb2e11217afed4620ab6066ecd9ea4eeff198d42a1b9d72ef425123af"}
Apr 24 21:27:32.697431 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:32.697398 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:32.697642 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:32.697398 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:32.697642 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:32.697460 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2"
Apr 24 21:27:32.697642 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:32.697601 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811"
Apr 24 21:27:32.697807 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:32.697680 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da"
Apr 24 21:27:32.697807 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:32.697767 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8"
Apr 24 21:27:34.697752 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:34.697677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2"
Apr 24 21:27:34.697752 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:34.697706 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:34.698535 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:34.697678 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:34.698535 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:34.697788 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49ln2" podUID="e4eb2c6b-a416-422e-a95a-e7759eca39e8"
Apr 24 21:27:34.698535 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:34.697856 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-g2stj" podUID="0c5854d5-4980-4604-9aa1-a757c380c0da"
Apr 24 21:27:34.698535 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:34.697920 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r5pbt" podUID="c88b1b60-c919-439f-810d-ad2b2ecf4811"
Apr 24 21:27:36.397762 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.397677 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-65.ec2.internal" event="NodeReady"
Apr 24 21:27:36.398216 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.397840 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:27:36.436766 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.436704 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76b48cbfb7-6htxn"]
Apr 24 21:27:36.469645 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.469015 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"]
Apr 24 21:27:36.469645 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.469261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.473370 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.473309 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:27:36.473515 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.473406 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:27:36.473515 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.473429 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:27:36.473808 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.473542 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ttx96\""
Apr 24 21:27:36.480786 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.480763 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:27:36.487878 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.487629 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh"]
Apr 24 21:27:36.487878 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.487821 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"
Apr 24 21:27:36.490708 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.490687 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.490843 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.490826 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.490996 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.490735 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-b25rk\""
Apr 24 21:27:36.491363 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.491343 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:27:36.514853 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.514828 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"]
Apr 24 21:27:36.514988 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.514958 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh"
Apr 24 21:27:36.517480 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.517462 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.517621 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.517462 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-nxvpg\""
Apr 24 21:27:36.517621 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.517464 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.537453 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.537430 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-54fb9fcb78-xvm5f"]
Apr 24 21:27:36.537632 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.537613 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"
Apr 24 21:27:36.540139 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.540119 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 21:27:36.540239 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.540121 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-p8rqx\""
Apr 24 21:27:36.540239 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.540206 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.541363 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.541342 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.541790 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.541771 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 21:27:36.552401 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.552381 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r269t"]
Apr 24 21:27:36.552565 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.552542 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:36.555168 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555148 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 21:27:36.555278 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555251 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.555358 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555322 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.555358 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555347 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 21:27:36.555880 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555857 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 21:27:36.555967 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555918 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 21:27:36.556020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.555987 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vn6v6\""
Apr 24 21:27:36.574789 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.574762 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5"]
Apr 24 21:27:36.574955 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.574929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r269t"
Apr 24 21:27:36.577780 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.577760 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 24 21:27:36.577867 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.577761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-6r59f\""
Apr 24 21:27:36.577867 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.577819 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 24 21:27:36.578135 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.578121 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.582714 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.582692 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 24 21:27:36.590149 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.590128 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.597899 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.597877 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pjgmr"]
Apr 24 21:27:36.598033 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.598013 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5"
Apr 24 21:27:36.600491 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.600473 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 21:27:36.600607 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.600507 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.600607 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.600541 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-z7jrt\""
Apr 24 21:27:36.600866 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.600853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.600946 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.600899 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 21:27:36.623148 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.623120 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk"]
Apr 24 21:27:36.623253 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.623220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr"
Apr 24 21:27:36.625767 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.625740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-trusted-ca\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.625875 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.625774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzd2z\" (UniqueName: \"kubernetes.io/projected/fc4caba6-6af1-42ee-94d4-c7492907eb5a-kube-api-access-jzd2z\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"
Apr 24 21:27:36.625875 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.625808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpvf\" (UniqueName: \"kubernetes.io/projected/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-kube-api-access-jhpvf\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:36.625875 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.625841 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b5880055-3142-40fe-9ab5-cee4fa3d85e7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"
Apr 24 21:27:36.625875 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.625867 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:36.626081 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.625924 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2zdn\" (UniqueName: \"kubernetes.io/projected/557745bb-3785-45f5-8eed-774938893b62-kube-api-access-m2zdn\") pod \"volume-data-source-validator-7c6cbb6c87-g6hwh\" (UID: \"557745bb-3785-45f5-8eed-774938893b62\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh"
Apr 24 21:27:36.626081 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.626081 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626037 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"
Apr 24 21:27:36.626081 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-default-certificate\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:36.626270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d82sd\" (UniqueName: \"kubernetes.io/projected/b5880055-3142-40fe-9ab5-cee4fa3d85e7-kube-api-access-d82sd\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"
Apr 24 21:27:36.626270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5kt2\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-kube-api-access-m5kt2\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.626270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626183 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:27:36.626270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626235 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:27:36.626270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"
Apr 24 21:27:36.626270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626264 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.626556 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-k69nf\""
Apr 24 21:27:36.626556 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626187 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.626556 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:36.626556 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-stats-auth\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:36.626556 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-certificates\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.626556 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-ca-trust-extracted\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.626822 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-installation-pull-secrets\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.626822 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-image-registry-private-configuration\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.626822 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.626648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-bound-sa-token\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:36.631376 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.631353 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:27:36.639267 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.639237 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr"]
Apr 24 21:27:36.639386 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.639374 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk"
Apr 24 21:27:36.642070 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.642006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-tqsm7\""
Apr 24 21:27:36.642070 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.642021 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.642070 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.642052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.662244 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.662176 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"]
Apr 24 21:27:36.662360 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.662314 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr"
Apr 24 21:27:36.665079 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.664994 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 21:27:36.665510 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.665485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-h6x4r\""
Apr 24 21:27:36.665665 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.665647 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.665794 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.665776 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 21:27:36.668006 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.667983 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.684151 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.684128 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"]
Apr 24 21:27:36.684285 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.684260 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"
Apr 24 21:27:36.686723 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.686701 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:27:36.686812 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.686733 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 21:27:36.686883 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.686840 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-dcz2c\""
Apr 24 21:27:36.686989 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.686975 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:27:36.687053 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.686976 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:27:36.700417 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.700394 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"]
Apr 24 21:27:36.700551 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.700528 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.703433 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.703408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 21:27:36.703527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.703409 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 21:27:36.703527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.703448 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 21:27:36.703527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.703479 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 21:27:36.716691 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716671 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76b48cbfb7-6htxn"]
Apr 24 21:27:36.716691 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716694 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"]
Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716708 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5"]
Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716720 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r269t"]
Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716808
2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716811 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716820 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-54fb9fcb78-xvm5f"] Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716816 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716843 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5tnsp"] Apr 24 21:27:36.716846 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.716816 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:36.719600 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.719552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:36.719600 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.719566 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:27:36.719761 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.719607 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xwzc6\"" Apr 24 21:27:36.719935 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.719916 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:27:36.720028 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.719984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tshq6\"" Apr 24 21:27:36.727369 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-ca-trust-extracted\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727466 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-bound-sa-token\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " 
pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727466 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-image-registry-private-configuration\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727620 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07997d80-1c4d-46a5-a441-0ac4b389addb-config\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.727620 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727549 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-serving-cert\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.727620 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-trusted-ca\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.727777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.727777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b5880055-3142-40fe-9ab5-cee4fa3d85e7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:36.727777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-ca-trust-extracted\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " 
pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-default-certificate\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727823 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07997d80-1c4d-46a5-a441-0ac4b389addb-serving-cert\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.727824 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjwv\" (UniqueName: \"kubernetes.io/projected/392b6d2c-0064-4bb6-b65f-6bef7161bc01-kube-api-access-psjwv\") pod 
\"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5kt2\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-kube-api-access-m5kt2\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.727929 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.227908229 +0000 UTC m=+34.110894422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : secret "router-metrics-certs-default" not found Apr 24 21:27:36.727971 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727959 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07997d80-1c4d-46a5-a441-0ac4b389addb-trusted-ca\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.727991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqt7\" (UniqueName: 
\"kubernetes.io/projected/07997d80-1c4d-46a5-a441-0ac4b389addb-kube-api-access-wqqt7\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/392b6d2c-0064-4bb6-b65f-6bef7161bc01-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/392b6d2c-0064-4bb6-b65f-6bef7161bc01-config\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-stats-auth\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.727862 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzd2z\" (UniqueName: 
\"kubernetes.io/projected/fc4caba6-6af1-42ee-94d4-c7492907eb5a-kube-api-access-jzd2z\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.728136 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76b48cbfb7-6htxn: secret "image-registry-tls" not found Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-installation-pull-secrets\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.728216 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls podName:3d2ccbb7-d5b5-4177-aab6-7905980c06f8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.228197231 +0000 UTC m=+34.111183411 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls") pod "image-registry-76b48cbfb7-6htxn" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8") : secret "image-registry-tls" not found Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-certificates\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-tmp\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4mwd\" (UniqueName: \"kubernetes.io/projected/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-kube-api-access-c4mwd\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.728380 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpvf\" (UniqueName: \"kubernetes.io/projected/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-kube-api-access-jhpvf\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-service-ca-bundle\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-trusted-ca\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728538 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzr7\" (UniqueName: \"kubernetes.io/projected/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-kube-api-access-wxzr7\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728774 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.728846 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2zdn\" (UniqueName: \"kubernetes.io/projected/557745bb-3785-45f5-8eed-774938893b62-kube-api-access-m2zdn\") pod \"volume-data-source-validator-7c6cbb6c87-g6hwh\" (UID: \"557745bb-3785-45f5-8eed-774938893b62\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.728880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-certificates\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.728926 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls podName:fc4caba6-6af1-42ee-94d4-c7492907eb5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.22891263 +0000 UTC m=+34.111898810 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-928mk" (UID: "fc4caba6-6af1-42ee-94d4-c7492907eb5a") : secret "samples-operator-tls" not found Apr 24 21:27:36.729063 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.729054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rjtv\" (UniqueName: \"kubernetes.io/projected/a5af1e3f-499d-4135-9910-eec6dffebf8e-kube-api-access-5rjtv\") pod \"network-check-source-8894fc9bd-5zhmk\" (UID: \"a5af1e3f-499d-4135-9910-eec6dffebf8e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk" Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.729101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-snapshots\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.729140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d82sd\" (UniqueName: \"kubernetes.io/projected/b5880055-3142-40fe-9ab5-cee4fa3d85e7-kube-api-access-d82sd\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.729176 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.729204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.729340 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.22931834 +0000 UTC m=+34.112304539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.729409 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:36.729602 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.729445 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls podName:b5880055-3142-40fe-9ab5-cee4fa3d85e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.229435978 +0000 UTC m=+34.112422178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5r28j" (UID: "b5880055-3142-40fe-9ab5-cee4fa3d85e7") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:36.732873 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.732851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-image-registry-private-configuration\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.732965 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.732898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-default-certificate\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.732965 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.732911 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-installation-pull-secrets\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.732965 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.732898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-stats-auth\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " 
pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.733838 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.733818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"] Apr 24 21:27:36.733918 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.733872 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr"] Apr 24 21:27:36.733918 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.733888 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pjgmr"] Apr 24 21:27:36.733918 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.733903 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l9zd5"] Apr 24 21:27:36.734076 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.733986 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:36.734493 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.734461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b5880055-3142-40fe-9ab5-cee4fa3d85e7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:36.736541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.736449 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:36.737036 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.736782 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j9cg5\"" Apr 24 21:27:36.737036 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.737003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.737185 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.737040 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.739320 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.739297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5kt2\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-kube-api-access-m5kt2\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.740039 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.740015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-bound-sa-token\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:36.740124 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.740081 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzd2z\" (UniqueName: \"kubernetes.io/projected/fc4caba6-6af1-42ee-94d4-c7492907eb5a-kube-api-access-jzd2z\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:27:36.742414 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.742390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2zdn\" (UniqueName: \"kubernetes.io/projected/557745bb-3785-45f5-8eed-774938893b62-kube-api-access-m2zdn\") pod \"volume-data-source-validator-7c6cbb6c87-g6hwh\" (UID: \"557745bb-3785-45f5-8eed-774938893b62\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh" Apr 24 21:27:36.742526 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.742494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d82sd\" (UniqueName: \"kubernetes.io/projected/b5880055-3142-40fe-9ab5-cee4fa3d85e7-kube-api-access-d82sd\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:36.743278 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.743257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpvf\" (UniqueName: \"kubernetes.io/projected/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-kube-api-access-jhpvf\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: 
\"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:36.759407 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759385 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"] Apr 24 21:27:36.759407 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759409 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk"] Apr 24 21:27:36.759591 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759425 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh"] Apr 24 21:27:36.759591 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759437 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5tnsp"] Apr 24 21:27:36.759591 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759559 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.759924 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759906 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l9zd5"] Apr 24 21:27:36.760010 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759932 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"] Apr 24 21:27:36.760010 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.759945 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"] Apr 24 21:27:36.762137 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.762119 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:36.762232 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.762213 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:36.762439 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.762425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m29p7\"" Apr 24 21:27:36.824173 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.824144 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh" Apr 24 21:27:36.830282 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-ca\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.830442 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07997d80-1c4d-46a5-a441-0ac4b389addb-config\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.830442 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-serving-cert\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.830442 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58babfb5-74e8-4175-b89d-bec0e2b2ea46-config-volume\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.830442 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.830665 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ccf3a0b9-235d-45da-9881-2438bd55bcd7-tmp\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:36.830665 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830627 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.830665 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07997d80-1c4d-46a5-a441-0ac4b389addb-serving-cert\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.830774 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.830774 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psjwv\" (UniqueName: \"kubernetes.io/projected/392b6d2c-0064-4bb6-b65f-6bef7161bc01-kube-api-access-psjwv\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.830774 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07997d80-1c4d-46a5-a441-0ac4b389addb-trusted-ca\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.830915 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqpz8\" (UniqueName: \"kubernetes.io/projected/f06595a3-8f61-4c8e-93ae-03b5b752052a-kube-api-access-lqpz8\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:36.830915 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqt7\" (UniqueName: \"kubernetes.io/projected/07997d80-1c4d-46a5-a441-0ac4b389addb-kube-api-access-wqqt7\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 
21:27:36.830915 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/392b6d2c-0064-4bb6-b65f-6bef7161bc01-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.830915 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/392b6d2c-0064-4bb6-b65f-6bef7161bc01-config\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.830915 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3273bf0f-5e81-4792-af04-76455d6aa3a1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/58babfb5-74e8-4175-b89d-bec0e2b2ea46-tmp-dir\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.830995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831021 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ccf3a0b9-235d-45da-9881-2438bd55bcd7-klusterlet-config\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831050 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-tmp\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4mwd\" (UniqueName: \"kubernetes.io/projected/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-kube-api-access-c4mwd\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 
21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07997d80-1c4d-46a5-a441-0ac4b389addb-config\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.831146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-hub\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkp4c\" (UniqueName: 
\"kubernetes.io/projected/ccf3a0b9-235d-45da-9881-2438bd55bcd7-kube-api-access-nkp4c\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d1163b16-6958-43ed-80ab-5aa4d58bdac6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-848fcb745c-pn75t\" (UID: \"d1163b16-6958-43ed-80ab-5aa4d58bdac6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831226 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-service-ca-bundle\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzr7\" (UniqueName: \"kubernetes.io/projected/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-kube-api-access-wxzr7\") pod 
\"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831318 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb6cr\" (UniqueName: \"kubernetes.io/projected/d1163b16-6958-43ed-80ab-5aa4d58bdac6-kube-api-access-qb6cr\") pod \"managed-serviceaccount-addon-agent-848fcb745c-pn75t\" (UID: \"d1163b16-6958-43ed-80ab-5aa4d58bdac6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831348 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rjtv\" (UniqueName: \"kubernetes.io/projected/a5af1e3f-499d-4135-9910-eec6dffebf8e-kube-api-access-5rjtv\") pod \"network-check-source-8894fc9bd-5zhmk\" (UID: \"a5af1e3f-499d-4135-9910-eec6dffebf8e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-snapshots\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.831590 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk66d\" (UniqueName: \"kubernetes.io/projected/58babfb5-74e8-4175-b89d-bec0e2b2ea46-kube-api-access-sk66d\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.831590 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:27:36.831534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpks2\" (UniqueName: \"kubernetes.io/projected/3273bf0f-5e81-4792-af04-76455d6aa3a1-kube-api-access-vpks2\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.832093 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/392b6d2c-0064-4bb6-b65f-6bef7161bc01-config\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.832093 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-tmp\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.832093 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.831968 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-service-ca-bundle\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.832534 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.832511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-snapshots\") pod \"insights-operator-585dfdc468-r269t\" 
(UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.832881 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.832862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.834176 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.834085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-serving-cert\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.834845 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.834812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/392b6d2c-0064-4bb6-b65f-6bef7161bc01-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.835243 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.835216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07997d80-1c4d-46a5-a441-0ac4b389addb-serving-cert\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.835563 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.835538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/07997d80-1c4d-46a5-a441-0ac4b389addb-trusted-ca\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.837622 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.837601 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.846995 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.846951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjwv\" (UniqueName: \"kubernetes.io/projected/392b6d2c-0064-4bb6-b65f-6bef7161bc01-kube-api-access-psjwv\") pod \"service-ca-operator-d6fc45fc5-cj6t5\" (UID: \"392b6d2c-0064-4bb6-b65f-6bef7161bc01\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.847101 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.847014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzr7\" (UniqueName: \"kubernetes.io/projected/e5e1e857-05a9-48ac-ae67-6ad2198bbcf7-kube-api-access-wxzr7\") pod \"insights-operator-585dfdc468-r269t\" (UID: \"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7\") " pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.849593 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.849278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4mwd\" (UniqueName: \"kubernetes.io/projected/b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba-kube-api-access-c4mwd\") pod \"kube-storage-version-migrator-operator-6769c5d45-nklzr\" (UID: \"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" Apr 24 21:27:36.849593 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.849368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqt7\" (UniqueName: \"kubernetes.io/projected/07997d80-1c4d-46a5-a441-0ac4b389addb-kube-api-access-wqqt7\") pod \"console-operator-9d4b6777b-pjgmr\" (UID: \"07997d80-1c4d-46a5-a441-0ac4b389addb\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:36.851506 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.851476 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rjtv\" (UniqueName: \"kubernetes.io/projected/a5af1e3f-499d-4135-9910-eec6dffebf8e-kube-api-access-5rjtv\") pod \"network-check-source-8894fc9bd-5zhmk\" (UID: \"a5af1e3f-499d-4135-9910-eec6dffebf8e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk" Apr 24 21:27:36.885471 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.885433 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r269t" Apr 24 21:27:36.907234 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.907200 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" Apr 24 21:27:36.932385 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb6cr\" (UniqueName: \"kubernetes.io/projected/d1163b16-6958-43ed-80ab-5aa4d58bdac6-kube-api-access-qb6cr\") pod \"managed-serviceaccount-addon-agent-848fcb745c-pn75t\" (UID: \"d1163b16-6958-43ed-80ab-5aa4d58bdac6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t" Apr 24 21:27:36.932521 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk66d\" (UniqueName: \"kubernetes.io/projected/58babfb5-74e8-4175-b89d-bec0e2b2ea46-kube-api-access-sk66d\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.932521 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpks2\" (UniqueName: \"kubernetes.io/projected/3273bf0f-5e81-4792-af04-76455d6aa3a1-kube-api-access-vpks2\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.932521 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-ca\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.932521 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932509 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58babfb5-74e8-4175-b89d-bec0e2b2ea46-config-volume\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:36.932751 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ccf3a0b9-235d-45da-9881-2438bd55bcd7-tmp\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:36.932751 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:27:36.932751 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqpz8\" (UniqueName: \"kubernetes.io/projected/f06595a3-8f61-4c8e-93ae-03b5b752052a-kube-api-access-lqpz8\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:36.932751 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " 
pod="openshift-dns/dns-default-l9zd5"
Apr 24 21:27:36.932751 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3273bf0f-5e81-4792-af04-76455d6aa3a1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.932751 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58babfb5-74e8-4175-b89d-bec0e2b2ea46-tmp-dir\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5"
Apr 24 21:27:36.933020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp"
Apr 24 21:27:36.933020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ccf3a0b9-235d-45da-9881-2438bd55bcd7-klusterlet-config\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"
Apr 24 21:27:36.933020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-hub\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.933020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.932995 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr"
Apr 24 21:27:36.933208 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.933057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.933208 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.933088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkp4c\" (UniqueName: \"kubernetes.io/projected/ccf3a0b9-235d-45da-9881-2438bd55bcd7-kube-api-access-nkp4c\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"
Apr 24 21:27:36.933208 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.933113 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:36.933208 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.933174 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert podName:f06595a3-8f61-4c8e-93ae-03b5b752052a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.433155038 +0000 UTC m=+34.316141219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert") pod "ingress-canary-5tnsp" (UID: "f06595a3-8f61-4c8e-93ae-03b5b752052a") : secret "canary-serving-cert" not found
Apr 24 21:27:36.933406 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.933035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ccf3a0b9-235d-45da-9881-2438bd55bcd7-tmp\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"
Apr 24 21:27:36.933454 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.933440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58babfb5-74e8-4175-b89d-bec0e2b2ea46-tmp-dir\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5"
Apr 24 21:27:36.933508 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.933118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d1163b16-6958-43ed-80ab-5aa4d58bdac6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-848fcb745c-pn75t\" (UID: \"d1163b16-6958-43ed-80ab-5aa4d58bdac6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"
Apr 24 21:27:36.933726 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.933633 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:36.933726 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:36.933700 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls podName:58babfb5-74e8-4175-b89d-bec0e2b2ea46 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.433683301 +0000 UTC m=+34.316669482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls") pod "dns-default-l9zd5" (UID: "58babfb5-74e8-4175-b89d-bec0e2b2ea46") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:36.934020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.933962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58babfb5-74e8-4175-b89d-bec0e2b2ea46-config-volume\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5"
Apr 24 21:27:36.935889 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.935867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3273bf0f-5e81-4792-af04-76455d6aa3a1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.936048 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.936017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-ca\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.936238 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.936197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-hub\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.936238 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.936218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.936444 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.936424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3273bf0f-5e81-4792-af04-76455d6aa3a1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.937044 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.937026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ccf3a0b9-235d-45da-9881-2438bd55bcd7-klusterlet-config\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"
Apr 24 21:27:36.937777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.937758 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d1163b16-6958-43ed-80ab-5aa4d58bdac6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-848fcb745c-pn75t\" (UID: \"d1163b16-6958-43ed-80ab-5aa4d58bdac6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"
Apr 24 21:27:36.943501 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.943475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk66d\" (UniqueName: \"kubernetes.io/projected/58babfb5-74e8-4175-b89d-bec0e2b2ea46-kube-api-access-sk66d\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5"
Apr 24 21:27:36.943501 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.943491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqpz8\" (UniqueName: \"kubernetes.io/projected/f06595a3-8f61-4c8e-93ae-03b5b752052a-kube-api-access-lqpz8\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp"
Apr 24 21:27:36.943674 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.943474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpks2\" (UniqueName: \"kubernetes.io/projected/3273bf0f-5e81-4792-af04-76455d6aa3a1-kube-api-access-vpks2\") pod \"cluster-proxy-proxy-agent-55b75bb794-j2j5t\" (UID: \"3273bf0f-5e81-4792-af04-76455d6aa3a1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:36.945550 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.945513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb6cr\" (UniqueName: \"kubernetes.io/projected/d1163b16-6958-43ed-80ab-5aa4d58bdac6-kube-api-access-qb6cr\") pod \"managed-serviceaccount-addon-agent-848fcb745c-pn75t\" (UID: \"d1163b16-6958-43ed-80ab-5aa4d58bdac6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"
Apr 24 21:27:36.945862 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.945843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkp4c\" (UniqueName: \"kubernetes.io/projected/ccf3a0b9-235d-45da-9881-2438bd55bcd7-kube-api-access-nkp4c\") pod \"klusterlet-addon-workmgr-844f44667b-k7mm9\" (UID: \"ccf3a0b9-235d-45da-9881-2438bd55bcd7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"
Apr 24 21:27:36.948472 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.948451 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk"
Apr 24 21:27:36.975303 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:36.975278 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr"
Apr 24 21:27:37.006061 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.006027 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"
Apr 24 21:27:37.014909 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.014890 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"
Apr 24 21:27:37.068066 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.068032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"
Apr 24 21:27:37.236599 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.236499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:37.236599 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.236539 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.236651 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236687 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.236703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.236733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236735 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236770 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76b48cbfb7-6htxn: secret "image-registry-tls" not found
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236781 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.236758408 +0000 UTC m=+35.119744593 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : secret "router-metrics-certs-default" not found
Apr 24 21:27:37.236804 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236790 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:37.237134 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236813 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:27:37.237134 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236816 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls podName:3d2ccbb7-d5b5-4177-aab6-7905980c06f8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.236801191 +0000 UTC m=+35.119787387 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls") pod "image-registry-76b48cbfb7-6htxn" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8") : secret "image-registry-tls" not found
Apr 24 21:27:37.237134 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236898 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls podName:b5880055-3142-40fe-9ab5-cee4fa3d85e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.236881355 +0000 UTC m=+35.119867540 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5r28j" (UID: "b5880055-3142-40fe-9ab5-cee4fa3d85e7") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:37.237134 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236935 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls podName:fc4caba6-6af1-42ee-94d4-c7492907eb5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.236912363 +0000 UTC m=+35.119898546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-928mk" (UID: "fc4caba6-6af1-42ee-94d4-c7492907eb5a") : secret "samples-operator-tls" not found
Apr 24 21:27:37.237134 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.236965 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.23695402 +0000 UTC m=+35.119940203 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : configmap references non-existent config key: service-ca.crt
Apr 24 21:27:37.338355 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.338322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:37.340718 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.340692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7h56\" (UniqueName: \"kubernetes.io/projected/0c5854d5-4980-4604-9aa1-a757c380c0da-kube-api-access-s7h56\") pod \"network-check-target-g2stj\" (UID: \"0c5854d5-4980-4604-9aa1-a757c380c0da\") " pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:37.381920 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.381890 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-g2stj"
Apr 24 21:27:37.439589 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.439355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp"
Apr 24 21:27:37.439924 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.439495 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:37.439924 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.439704 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:27:37.439924 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.439637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt"
Apr 24 21:27:37.439924 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.439707 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert podName:f06595a3-8f61-4c8e-93ae-03b5b752052a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.439686161 +0000 UTC m=+35.322672344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert") pod "ingress-canary-5tnsp" (UID: "f06595a3-8f61-4c8e-93ae-03b5b752052a") : secret "canary-serving-cert" not found
Apr 24 21:27:37.439924 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.439765 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs podName:c88b1b60-c919-439f-810d-ad2b2ecf4811 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:09.439750102 +0000 UTC m=+66.322736297 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs") pod "network-metrics-daemon-r5pbt" (UID: "c88b1b60-c919-439f-810d-ad2b2ecf4811") : secret "metrics-daemon-secret" not found
Apr 24 21:27:37.439924 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.439863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5"
Apr 24 21:27:37.440112 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.439932 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:37.440112 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:37.439968 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls podName:58babfb5-74e8-4175-b89d-bec0e2b2ea46 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.439960941 +0000 UTC m=+35.322947124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls") pod "dns-default-l9zd5" (UID: "58babfb5-74e8-4175-b89d-bec0e2b2ea46") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:37.735323 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.735264 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-g2stj"]
Apr 24 21:27:37.748130 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.740791 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pjgmr"]
Apr 24 21:27:37.749806 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.749723 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5854d5_4980_4604_9aa1_a757c380c0da.slice/crio-37c6539488f53faad40d58a5fd8e74a5f63065bcbdc85846f5b148c420164924 WatchSource:0}: Error finding container 37c6539488f53faad40d58a5fd8e74a5f63065bcbdc85846f5b148c420164924: Status 404 returned error can't find the container with id 37c6539488f53faad40d58a5fd8e74a5f63065bcbdc85846f5b148c420164924
Apr 24 21:27:37.751527 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.751216 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07997d80_1c4d_46a5_a441_0ac4b389addb.slice/crio-f0355403bbd2a40a485ead3bba6defe2f5dbc3ff39ded161287fa38b552069ae WatchSource:0}: Error finding container f0355403bbd2a40a485ead3bba6defe2f5dbc3ff39ded161287fa38b552069ae: Status 404 returned error can't find the container with id f0355403bbd2a40a485ead3bba6defe2f5dbc3ff39ded161287fa38b552069ae
Apr 24 21:27:37.761830 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.761705 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t"]
Apr 24 21:27:37.768770 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.768728 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr"]
Apr 24 21:27:37.772867 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.772832 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5480cfe_1e95_43c8_a907_f0b5c7dcb7ba.slice/crio-1161c4a327e4ff1b976b411cb68af6e1f94ce3cdce677070a5c26edad585d0de WatchSource:0}: Error finding container 1161c4a327e4ff1b976b411cb68af6e1f94ce3cdce677070a5c26edad585d0de: Status 404 returned error can't find the container with id 1161c4a327e4ff1b976b411cb68af6e1f94ce3cdce677070a5c26edad585d0de
Apr 24 21:27:37.776254 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.776176 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5"]
Apr 24 21:27:37.780111 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.780089 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9"]
Apr 24 21:27:37.782232 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.782188 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392b6d2c_0064_4bb6_b65f_6bef7161bc01.slice/crio-f4ef4ca6e8c41d6ff9ec627c4a5d44bb96467075ce969830ef061daf7688b815 WatchSource:0}: Error finding container f4ef4ca6e8c41d6ff9ec627c4a5d44bb96467075ce969830ef061daf7688b815: Status 404 returned error can't find the container with id f4ef4ca6e8c41d6ff9ec627c4a5d44bb96467075ce969830ef061daf7688b815
Apr 24 21:27:37.792378 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.792356 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccf3a0b9_235d_45da_9881_2438bd55bcd7.slice/crio-8d3d77565da4f0bd6c862f3d4517e2c1b97fd7a04bc638474792179014febbb6 WatchSource:0}: Error finding container 8d3d77565da4f0bd6c862f3d4517e2c1b97fd7a04bc638474792179014febbb6: Status 404 returned error can't find the container with id 8d3d77565da4f0bd6c862f3d4517e2c1b97fd7a04bc638474792179014febbb6
Apr 24 21:27:37.799566 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.799178 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh"]
Apr 24 21:27:37.800227 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.800208 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk"]
Apr 24 21:27:37.801003 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.800984 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r269t"]
Apr 24 21:27:37.809744 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.809716 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5af1e3f_499d_4135_9910_eec6dffebf8e.slice/crio-8f0b135295bd916aca08a34c890120a4fb1a532a8cef0da359145f1979f70a61 WatchSource:0}: Error finding container 8f0b135295bd916aca08a34c890120a4fb1a532a8cef0da359145f1979f70a61: Status 404 returned error can't find the container with id 8f0b135295bd916aca08a34c890120a4fb1a532a8cef0da359145f1979f70a61
Apr 24 21:27:37.811437 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.811065 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e1e857_05a9_48ac_ae67_6ad2198bbcf7.slice/crio-ff002ef376fa0d65ffb1a7bb2e0bed7816ab527625e3d4fda32def9da7443328 WatchSource:0}: Error finding container ff002ef376fa0d65ffb1a7bb2e0bed7816ab527625e3d4fda32def9da7443328: Status 404 returned error can't find the container with id ff002ef376fa0d65ffb1a7bb2e0bed7816ab527625e3d4fda32def9da7443328
Apr 24 21:27:37.812338 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.811965 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557745bb_3785_45f5_8eed_774938893b62.slice/crio-c98dd6e37938285d6291a11c4cbe00e5077fdbc02fdee39164afac05869c1935 WatchSource:0}: Error finding container c98dd6e37938285d6291a11c4cbe00e5077fdbc02fdee39164afac05869c1935: Status 404 returned error can't find the container with id c98dd6e37938285d6291a11c4cbe00e5077fdbc02fdee39164afac05869c1935
Apr 24 21:27:37.812338 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.812206 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t"]
Apr 24 21:27:37.815631 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:37.815613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3273bf0f_5e81_4792_af04_76455d6aa3a1.slice/crio-000c3f3374a152ff878e5bf5d54c868bcbcaff95c789bcfd5d69c991e2bed0bc WatchSource:0}: Error finding container 000c3f3374a152ff878e5bf5d54c868bcbcaff95c789bcfd5d69c991e2bed0bc: Status 404 returned error can't find the container with id 000c3f3374a152ff878e5bf5d54c868bcbcaff95c789bcfd5d69c991e2bed0bc
Apr 24 21:27:37.845646 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.845606 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" event={"ID":"ccf3a0b9-235d-45da-9881-2438bd55bcd7","Type":"ContainerStarted","Data":"8d3d77565da4f0bd6c862f3d4517e2c1b97fd7a04bc638474792179014febbb6"}
Apr 24 21:27:37.846714 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.846686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" event={"ID":"3273bf0f-5e81-4792-af04-76455d6aa3a1","Type":"ContainerStarted","Data":"000c3f3374a152ff878e5bf5d54c868bcbcaff95c789bcfd5d69c991e2bed0bc"}
Apr 24 21:27:37.848146 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.848107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh" event={"ID":"557745bb-3785-45f5-8eed-774938893b62","Type":"ContainerStarted","Data":"c98dd6e37938285d6291a11c4cbe00e5077fdbc02fdee39164afac05869c1935"}
Apr 24 21:27:37.851176 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.851148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t" event={"ID":"d1163b16-6958-43ed-80ab-5aa4d58bdac6","Type":"ContainerStarted","Data":"fe9b86aa63047b14abd4ca4700612d50794b5aa508f885d80b81d5bc5b2895f2"}
Apr 24 21:27:37.852208 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.852181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r269t" event={"ID":"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7","Type":"ContainerStarted","Data":"ff002ef376fa0d65ffb1a7bb2e0bed7816ab527625e3d4fda32def9da7443328"}
Apr 24 21:27:37.853209 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.853188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g2stj" event={"ID":"0c5854d5-4980-4604-9aa1-a757c380c0da","Type":"ContainerStarted","Data":"37c6539488f53faad40d58a5fd8e74a5f63065bcbdc85846f5b148c420164924"}
Apr 24 21:27:37.854014 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.853991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" event={"ID":"07997d80-1c4d-46a5-a441-0ac4b389addb","Type":"ContainerStarted","Data":"f0355403bbd2a40a485ead3bba6defe2f5dbc3ff39ded161287fa38b552069ae"}
Apr 24 21:27:37.854794 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.854776 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk" event={"ID":"a5af1e3f-499d-4135-9910-eec6dffebf8e","Type":"ContainerStarted","Data":"8f0b135295bd916aca08a34c890120a4fb1a532a8cef0da359145f1979f70a61"}
Apr 24 21:27:37.855701 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.855681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" event={"ID":"392b6d2c-0064-4bb6-b65f-6bef7161bc01","Type":"ContainerStarted","Data":"f4ef4ca6e8c41d6ff9ec627c4a5d44bb96467075ce969830ef061daf7688b815"}
Apr 24 21:27:37.856601 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:37.856568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" event={"ID":"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba","Type":"ContainerStarted","Data":"1161c4a327e4ff1b976b411cb68af6e1f94ce3cdce677070a5c26edad585d0de"}
Apr 24 21:27:38.251943 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.251913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"
Apr 24 21:27:38.252132 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.251967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"
Apr 24 21:27:38.252132 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.252002 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:38.252132 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.252062 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f"
Apr 24 21:27:38.252132 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252077 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:27:38.252132 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.252087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252144 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls podName:fc4caba6-6af1-42ee-94d4-c7492907eb5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.252124294 +0000 UTC m=+37.135110489 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-928mk" (UID: "fc4caba6-6af1-42ee-94d4-c7492907eb5a") : secret "samples-operator-tls" not found
Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252158 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252183 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252195 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76b48cbfb7-6htxn: secret "image-registry-tls" not found
Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252227 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls podName:b5880055-3142-40fe-9ab5-cee4fa3d85e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.252208553 +0000 UTC m=+37.135194744 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5r28j" (UID: "b5880055-3142-40fe-9ab5-cee4fa3d85e7") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252246 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls podName:3d2ccbb7-d5b5-4177-aab6-7905980c06f8 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.252237078 +0000 UTC m=+37.135223263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls") pod "image-registry-76b48cbfb7-6htxn" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8") : secret "image-registry-tls" not found Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252289 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.252277217 +0000 UTC m=+37.135263416 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252320 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:38.252385 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.252361 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.252350671 +0000 UTC m=+37.135336857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : secret "router-metrics-certs-default" not found Apr 24 21:27:38.454495 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.454452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:38.455022 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.454516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:38.455022 ip-10-0-136-65 
kubenswrapper[2577]: E0424 21:27:38.454870 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:38.455022 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.454939 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert podName:f06595a3-8f61-4c8e-93ae-03b5b752052a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.454918556 +0000 UTC m=+37.337904741 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert") pod "ingress-canary-5tnsp" (UID: "f06595a3-8f61-4c8e-93ae-03b5b752052a") : secret "canary-serving-cert" not found Apr 24 21:27:38.455463 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.455327 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:38.455463 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:38.455385 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls podName:58babfb5-74e8-4175-b89d-bec0e2b2ea46 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.455368522 +0000 UTC m=+37.338354709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls") pod "dns-default-l9zd5" (UID: "58babfb5-74e8-4175-b89d-bec0e2b2ea46") : secret "dns-default-metrics-tls" not found Apr 24 21:27:38.902645 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.901508 2577 generic.go:358] "Generic (PLEG): container finished" podID="44478f44-ad28-4f73-9fd4-429d584502ef" containerID="8883a4ed64527d69425c6f153ce8ecf5d2bf8098726e734a0586b6b0cf441e8e" exitCode=0 Apr 24 21:27:38.902645 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:38.901606 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerDied","Data":"8883a4ed64527d69425c6f153ce8ecf5d2bf8098726e734a0586b6b0cf441e8e"} Apr 24 21:27:39.772448 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:39.772163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:39.797331 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:39.797264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4eb2c6b-a416-422e-a95a-e7759eca39e8-original-pull-secret\") pod \"global-pull-secret-syncer-49ln2\" (UID: \"e4eb2c6b-a416-422e-a95a-e7759eca39e8\") " pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:39.948730 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:39.948688 2577 generic.go:358] "Generic (PLEG): container finished" podID="44478f44-ad28-4f73-9fd4-429d584502ef" containerID="bf920e3001a9c3e5e7367f097c8fd1e9c5a482790faa1555e0d118396e202552" exitCode=0 Apr 24 
21:27:39.948905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:39.948750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerDied","Data":"bf920e3001a9c3e5e7367f097c8fd1e9c5a482790faa1555e0d118396e202552"} Apr 24 21:27:40.075411 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.075324 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49ln2" Apr 24 21:27:40.278397 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.278356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:27:40.278589 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.278435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:40.278589 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.278468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:40.278589 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.278527 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:40.278589 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.278552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:40.278900 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.278877 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:40.278969 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.278953 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls podName:b5880055-3142-40fe-9ab5-cee4fa3d85e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.278931935 +0000 UTC m=+41.161918121 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5r28j" (UID: "b5880055-3142-40fe-9ab5-cee4fa3d85e7") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:40.279401 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279379 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:40.279514 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279436 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls podName:fc4caba6-6af1-42ee-94d4-c7492907eb5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.279421527 +0000 UTC m=+41.162407708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-928mk" (UID: "fc4caba6-6af1-42ee-94d4-c7492907eb5a") : secret "samples-operator-tls" not found Apr 24 21:27:40.279627 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279517 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.279506766 +0000 UTC m=+41.162492952 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:40.279627 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279597 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:40.279732 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279633 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.279622671 +0000 UTC m=+41.162608866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : secret "router-metrics-certs-default" not found Apr 24 21:27:40.279916 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279898 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:40.280001 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279922 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76b48cbfb7-6htxn: secret "image-registry-tls" not found Apr 24 21:27:40.280001 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.279969 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls podName:3d2ccbb7-d5b5-4177-aab6-7905980c06f8 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:44.279954892 +0000 UTC m=+41.162941076 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls") pod "image-registry-76b48cbfb7-6htxn" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8") : secret "image-registry-tls" not found Apr 24 21:27:40.482558 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.480870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:40.482558 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:40.480996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:40.482558 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.481286 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:40.482558 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.481348 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert podName:f06595a3-8f61-4c8e-93ae-03b5b752052a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.48132854 +0000 UTC m=+41.364314724 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert") pod "ingress-canary-5tnsp" (UID: "f06595a3-8f61-4c8e-93ae-03b5b752052a") : secret "canary-serving-cert" not found Apr 24 21:27:40.482558 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.481768 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:40.482558 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:40.481819 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls podName:58babfb5-74e8-4175-b89d-bec0e2b2ea46 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.481803853 +0000 UTC m=+41.364790033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls") pod "dns-default-l9zd5" (UID: "58babfb5-74e8-4175-b89d-bec0e2b2ea46") : secret "dns-default-metrics-tls" not found Apr 24 21:27:44.318419 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.318377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.318451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.318482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.318532 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318548 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.318555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318644 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls podName:fc4caba6-6af1-42ee-94d4-c7492907eb5a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.318621949 +0000 UTC m=+49.201608130 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-928mk" (UID: "fc4caba6-6af1-42ee-94d4-c7492907eb5a") : secret "samples-operator-tls" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318645 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318666 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76b48cbfb7-6htxn: secret "image-registry-tls" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318701 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318704 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.31869298 +0000 UTC m=+49.201679164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318725 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls podName:3d2ccbb7-d5b5-4177-aab6-7905980c06f8 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:52.318715964 +0000 UTC m=+49.201702146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls") pod "image-registry-76b48cbfb7-6htxn" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8") : secret "image-registry-tls" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318741 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.318733314 +0000 UTC m=+49.201719499 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : secret "router-metrics-certs-default" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318794 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:44.319221 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.318837 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls podName:b5880055-3142-40fe-9ab5-cee4fa3d85e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.31882743 +0000 UTC m=+49.201813615 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5r28j" (UID: "b5880055-3142-40fe-9ab5-cee4fa3d85e7") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:44.521547 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.521507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:44.521752 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:44.521564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:44.521752 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.521683 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:44.521752 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.521753 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls podName:58babfb5-74e8-4175-b89d-bec0e2b2ea46 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.521734993 +0000 UTC m=+49.404721175 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls") pod "dns-default-l9zd5" (UID: "58babfb5-74e8-4175-b89d-bec0e2b2ea46") : secret "dns-default-metrics-tls" not found Apr 24 21:27:44.521921 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.521683 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:44.521921 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:44.521843 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert podName:f06595a3-8f61-4c8e-93ae-03b5b752052a nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.521825404 +0000 UTC m=+49.404811584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert") pod "ingress-canary-5tnsp" (UID: "f06595a3-8f61-4c8e-93ae-03b5b752052a") : secret "canary-serving-cert" not found Apr 24 21:27:50.769325 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.769294 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-49ln2"] Apr 24 21:27:50.780243 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:50.780094 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4eb2c6b_a416_422e_a95a_e7759eca39e8.slice/crio-84e2ed186e372e49e74498b11bfb85f2140c63a0bb459128bb4d17b07fc0e97a WatchSource:0}: Error finding container 84e2ed186e372e49e74498b11bfb85f2140c63a0bb459128bb4d17b07fc0e97a: Status 404 returned error can't find the container with id 84e2ed186e372e49e74498b11bfb85f2140c63a0bb459128bb4d17b07fc0e97a Apr 24 21:27:50.978249 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.977413 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-r269t" event={"ID":"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7","Type":"ContainerStarted","Data":"7d2dc396267f2232499706be5f2c2ec05ba3af27581648555d85ee4ea9e73014"} Apr 24 21:27:50.980090 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.980061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-g2stj" event={"ID":"0c5854d5-4980-4604-9aa1-a757c380c0da","Type":"ContainerStarted","Data":"dc719d8246e01321e0eeb5ce8522caa8dad50eea0374c7a16a196f37b6468330"} Apr 24 21:27:50.983233 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.983203 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:27:50.983359 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.983349 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:50.983480 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.983466 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" event={"ID":"07997d80-1c4d-46a5-a441-0ac4b389addb","Type":"ContainerStarted","Data":"f80a4fcaadbf2db70e38eec91efdd682fe7d6278eb1c35f2427d7f7a9b261776"} Apr 24 21:27:50.985120 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.984302 2577 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-pjgmr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.11:8443/readyz\": dial tcp 10.132.0.11:8443: connect: connection refused" start-of-body= Apr 24 21:27:50.985120 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.984351 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" podUID="07997d80-1c4d-46a5-a441-0ac4b389addb" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.11:8443/readyz\": dial tcp 10.132.0.11:8443: connect: connection refused" Apr 24 21:27:50.986759 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.986335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk" event={"ID":"a5af1e3f-499d-4135-9910-eec6dffebf8e","Type":"ContainerStarted","Data":"f47ac68a2ce3ebedd1795475dc4423661878f751d6d0ee73136ff643008047d6"} Apr 24 21:27:50.988695 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.988560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" event={"ID":"392b6d2c-0064-4bb6-b65f-6bef7161bc01","Type":"ContainerStarted","Data":"7373fcc537799d0de9855a3c6acd35cef1f80f6a7d7a2741b162bc97fdb983ad"} Apr 24 21:27:50.990929 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.990907 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" event={"ID":"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba","Type":"ContainerStarted","Data":"b846c153f66105cb7ad71880d4285e0e9dabf01e79e675e00334d35a7897c704"} Apr 24 21:27:50.993355 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.993333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" event={"ID":"ccf3a0b9-235d-45da-9881-2438bd55bcd7","Type":"ContainerStarted","Data":"5d28ca1a86d9fa036d859d30789521913071efdc9487d36c12950d2b9d2cc453"} Apr 24 21:27:50.993894 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.993875 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:50.995645 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.995629 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" Apr 24 21:27:50.998719 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:50.998610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" event={"ID":"44478f44-ad28-4f73-9fd4-429d584502ef","Type":"ContainerStarted","Data":"6446db6f493f18ea5c086ee3e368e4aacd59449095c285bacd8c59d0e0407635"} Apr 24 21:27:51.001139 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.001095 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-r269t" podStartSLOduration=13.330664365 podStartE2EDuration="26.00106773s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.813101449 +0000 UTC m=+34.696087630" lastFinishedPulling="2026-04-24 21:27:50.483504802 +0000 UTC m=+47.366490995" observedRunningTime="2026-04-24 21:27:50.999887831 +0000 UTC m=+47.882874033" watchObservedRunningTime="2026-04-24 21:27:51.00106773 +0000 UTC m=+47.884053931" Apr 24 21:27:51.001754 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.001725 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" event={"ID":"3273bf0f-5e81-4792-af04-76455d6aa3a1","Type":"ContainerStarted","Data":"0805d8f4214c45d5a5efa863dbcf8b89c694255b3107bda8d6eba2137d3cf87e"} Apr 24 21:27:51.004754 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.003829 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh" event={"ID":"557745bb-3785-45f5-8eed-774938893b62","Type":"ContainerStarted","Data":"b94d0dbd1f1b0b0c9fbb5aaaf5e7d8e6ef336224abe27d241e92496eee0e34c0"} Apr 24 21:27:51.008990 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.008186 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t" event={"ID":"d1163b16-6958-43ed-80ab-5aa4d58bdac6","Type":"ContainerStarted","Data":"a23bb76462d858244f6244867da3cf761a2557c1a9d5419a914bc0d89021a0c2"} Apr 24 21:27:51.009743 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.009642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-49ln2" event={"ID":"e4eb2c6b-a416-422e-a95a-e7759eca39e8","Type":"ContainerStarted","Data":"84e2ed186e372e49e74498b11bfb85f2140c63a0bb459128bb4d17b07fc0e97a"} Apr 24 21:27:51.040287 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.040233 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" podStartSLOduration=13.203004374 podStartE2EDuration="26.040215988s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.784104514 +0000 UTC m=+34.667090697" lastFinishedPulling="2026-04-24 21:27:50.621316128 +0000 UTC m=+47.504302311" observedRunningTime="2026-04-24 21:27:51.039313207 +0000 UTC m=+47.922299407" watchObservedRunningTime="2026-04-24 21:27:51.040215988 +0000 UTC m=+47.923202194" Apr 24 21:27:51.040863 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.040822 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" podStartSLOduration=13.19463549 podStartE2EDuration="26.0408149s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.776386501 +0000 UTC m=+34.659372681" lastFinishedPulling="2026-04-24 21:27:50.622565894 +0000 UTC m=+47.505552091" observedRunningTime="2026-04-24 21:27:51.019535226 +0000 UTC m=+47.902521423" watchObservedRunningTime="2026-04-24 21:27:51.0408149 +0000 UTC m=+47.923801104" Apr 24 21:27:51.057048 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.056960 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-844f44667b-k7mm9" podStartSLOduration=7.231572658 podStartE2EDuration="20.056943942s" podCreationTimestamp="2026-04-24 21:27:31 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.794598014 +0000 UTC m=+34.677584202" lastFinishedPulling="2026-04-24 21:27:50.619969302 +0000 UTC m=+47.502955486" observedRunningTime="2026-04-24 21:27:51.056515395 +0000 UTC m=+47.939501600" watchObservedRunningTime="2026-04-24 21:27:51.056943942 +0000 UTC m=+47.939930144" Apr 24 21:27:51.083305 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.083255 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pvrrx" podStartSLOduration=16.659755758 podStartE2EDuration="48.083239491s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:06.366700002 +0000 UTC m=+3.249686187" lastFinishedPulling="2026-04-24 21:27:37.790183734 +0000 UTC m=+34.673169920" observedRunningTime="2026-04-24 21:27:51.081228768 +0000 UTC m=+47.964214971" watchObservedRunningTime="2026-04-24 21:27:51.083239491 +0000 UTC m=+47.966225694" Apr 24 21:27:51.099169 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.099122 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-g2stj" podStartSLOduration=35.233609093 podStartE2EDuration="48.099110335s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.754329616 +0000 UTC m=+34.637315797" lastFinishedPulling="2026-04-24 21:27:50.619830845 +0000 UTC m=+47.502817039" observedRunningTime="2026-04-24 21:27:51.098486477 +0000 UTC m=+47.981472678" watchObservedRunningTime="2026-04-24 21:27:51.099110335 +0000 UTC m=+47.982096538" Apr 24 21:27:51.117099 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:27:51.117056 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" podStartSLOduration=13.245796841 podStartE2EDuration="26.117044261s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.755138978 +0000 UTC m=+34.638125172" lastFinishedPulling="2026-04-24 21:27:50.626386411 +0000 UTC m=+47.509372592" observedRunningTime="2026-04-24 21:27:51.116789092 +0000 UTC m=+47.999775296" watchObservedRunningTime="2026-04-24 21:27:51.117044261 +0000 UTC m=+48.000030489" Apr 24 21:27:51.135993 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.135945 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5zhmk" podStartSLOduration=13.309080036 podStartE2EDuration="26.135908979s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.812267321 +0000 UTC m=+34.695253513" lastFinishedPulling="2026-04-24 21:27:50.639096275 +0000 UTC m=+47.522082456" observedRunningTime="2026-04-24 21:27:51.134560042 +0000 UTC m=+48.017546256" watchObservedRunningTime="2026-04-24 21:27:51.135908979 +0000 UTC m=+48.018895184" Apr 24 21:27:51.152203 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.152157 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-g6hwh" podStartSLOduration=13.482300962 podStartE2EDuration="26.152143245s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.813751041 +0000 UTC m=+34.696737222" lastFinishedPulling="2026-04-24 21:27:50.48359331 +0000 UTC m=+47.366579505" observedRunningTime="2026-04-24 21:27:51.151385818 +0000 UTC m=+48.034372021" watchObservedRunningTime="2026-04-24 21:27:51.152143245 +0000 UTC m=+48.035129446" Apr 24 21:27:51.173322 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:51.173282 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-848fcb745c-pn75t" podStartSLOduration=7.323495115 podStartE2EDuration="20.173269195s" podCreationTimestamp="2026-04-24 21:27:31 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.773335692 +0000 UTC m=+34.656321876" lastFinishedPulling="2026-04-24 21:27:50.623109765 +0000 UTC m=+47.506095956" observedRunningTime="2026-04-24 21:27:51.17281698 +0000 UTC m=+48.055803183" watchObservedRunningTime="2026-04-24 21:27:51.173269195 +0000 UTC m=+48.056255396" Apr 24 21:27:52.016617 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.016095 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/0.log" Apr 24 21:27:52.016617 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.016139 2577 generic.go:358] "Generic (PLEG): container finished" podID="07997d80-1c4d-46a5-a441-0ac4b389addb" containerID="f80a4fcaadbf2db70e38eec91efdd682fe7d6278eb1c35f2427d7f7a9b261776" exitCode=255 Apr 24 21:27:52.018059 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.017528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" event={"ID":"07997d80-1c4d-46a5-a441-0ac4b389addb","Type":"ContainerDied","Data":"f80a4fcaadbf2db70e38eec91efdd682fe7d6278eb1c35f2427d7f7a9b261776"} Apr 24 21:27:52.027275 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.020940 2577 scope.go:117] "RemoveContainer" containerID="f80a4fcaadbf2db70e38eec91efdd682fe7d6278eb1c35f2427d7f7a9b261776" Apr 24 21:27:52.397522 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.397482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:27:52.397714 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.397559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:27:52.397714 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.397606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:52.397714 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.397662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:27:52.397714 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.397687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " 
pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:27:52.397934 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.397851 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:52.397934 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.397914 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls podName:fc4caba6-6af1-42ee-94d4-c7492907eb5a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.397893901 +0000 UTC m=+65.280880084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-928mk" (UID: "fc4caba6-6af1-42ee-94d4-c7492907eb5a") : secret "samples-operator-tls" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398244 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398269 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-76b48cbfb7-6htxn: secret "image-registry-tls" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398313 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398343 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. 
No retries permitted until 2026-04-24 21:28:08.398146036 +0000 UTC m=+65.281132230 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398364 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398374 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls podName:3d2ccbb7-d5b5-4177-aab6-7905980c06f8 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.398358094 +0000 UTC m=+65.281344288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls") pod "image-registry-76b48cbfb7-6htxn" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8") : secret "image-registry-tls" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398399 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs podName:4eb7ba39-8089-4d1b-a005-3ecfaf77739a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.3983907 +0000 UTC m=+65.281376893 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs") pod "router-default-54fb9fcb78-xvm5f" (UID: "4eb7ba39-8089-4d1b-a005-3ecfaf77739a") : secret "router-metrics-certs-default" not found Apr 24 21:27:52.398417 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.398414 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls podName:b5880055-3142-40fe-9ab5-cee4fa3d85e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.398406111 +0000 UTC m=+65.281392292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-5r28j" (UID: "b5880055-3142-40fe-9ab5-cee4fa3d85e7") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:52.599976 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.599941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:27:52.600147 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.599999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:27:52.600147 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.600129 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 
21:27:52.600147 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.600138 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:52.600358 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.600175 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert podName:f06595a3-8f61-4c8e-93ae-03b5b752052a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.600162316 +0000 UTC m=+65.483148496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert") pod "ingress-canary-5tnsp" (UID: "f06595a3-8f61-4c8e-93ae-03b5b752052a") : secret "canary-serving-cert" not found Apr 24 21:27:52.600358 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:52.600198 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls podName:58babfb5-74e8-4175-b89d-bec0e2b2ea46 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.600180765 +0000 UTC m=+65.483166952 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls") pod "dns-default-l9zd5" (UID: "58babfb5-74e8-4175-b89d-bec0e2b2ea46") : secret "dns-default-metrics-tls" not found Apr 24 21:27:52.866625 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.866338 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr"] Apr 24 21:27:52.886497 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.886465 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr"] Apr 24 21:27:52.886696 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.886660 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" Apr 24 21:27:52.889414 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.889393 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 24 21:27:52.889607 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.889593 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:52.889762 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.889748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-k2gh7\"" Apr 24 21:27:52.903541 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.902729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j646n\" (UniqueName: \"kubernetes.io/projected/befbf90c-3ad0-4580-95b5-aab6da3d5a4d-kube-api-access-j646n\") pod \"migrator-74bb7799d9-lslhr\" (UID: 
\"befbf90c-3ad0-4580-95b5-aab6da3d5a4d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" Apr 24 21:27:52.978894 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:52.978832 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7phz8"] Apr 24 21:27:53.003332 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.003251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j646n\" (UniqueName: \"kubernetes.io/projected/befbf90c-3ad0-4580-95b5-aab6da3d5a4d-kube-api-access-j646n\") pod \"migrator-74bb7799d9-lslhr\" (UID: \"befbf90c-3ad0-4580-95b5-aab6da3d5a4d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" Apr 24 21:27:53.004711 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.003874 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7phz8"] Apr 24 21:27:53.004711 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.004029 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.007190 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.006601 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:27:53.007190 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.006831 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hhbdt\"" Apr 24 21:27:53.007190 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.007016 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:27:53.019148 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.019092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j646n\" (UniqueName: \"kubernetes.io/projected/befbf90c-3ad0-4580-95b5-aab6da3d5a4d-kube-api-access-j646n\") pod \"migrator-74bb7799d9-lslhr\" (UID: \"befbf90c-3ad0-4580-95b5-aab6da3d5a4d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" Apr 24 21:27:53.022263 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.022231 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log" Apr 24 21:27:53.022775 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.022697 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/0.log" Apr 24 21:27:53.022775 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.022740 2577 generic.go:358] "Generic (PLEG): container finished" podID="07997d80-1c4d-46a5-a441-0ac4b389addb" containerID="6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595" exitCode=255 Apr 24 
21:27:53.022936 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.022839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" event={"ID":"07997d80-1c4d-46a5-a441-0ac4b389addb","Type":"ContainerDied","Data":"6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595"} Apr 24 21:27:53.022936 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.022874 2577 scope.go:117] "RemoveContainer" containerID="f80a4fcaadbf2db70e38eec91efdd682fe7d6278eb1c35f2427d7f7a9b261776" Apr 24 21:27:53.023102 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.023075 2577 scope.go:117] "RemoveContainer" containerID="6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595" Apr 24 21:27:53.023532 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.023296 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjgmr_openshift-console-operator(07997d80-1c4d-46a5-a441-0ac4b389addb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" podUID="07997d80-1c4d-46a5-a441-0ac4b389addb" Apr 24 21:27:53.097973 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.097777 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs"] Apr 24 21:27:53.099416 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.099389 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s74k7_23aa9667-569b-4627-bc38-54b145825a25/dns-node-resolver/0.log" Apr 24 21:27:53.104877 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.104850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.105733 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.105063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.105733 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.105284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-crio-socket\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.105733 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.105391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-data-volume\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.105733 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.105541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gr8t\" (UniqueName: \"kubernetes.io/projected/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-kube-api-access-4gr8t\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.122056 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:27:53.121985 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs"] Apr 24 21:27:53.122219 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.122126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.124993 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.124966 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 21:27:53.125210 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.125192 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 21:27:53.125284 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.125246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7fxnn\"" Apr 24 21:27:53.199634 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.199593 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" Apr 24 21:27:53.206851 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.206821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-data-volume\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207006 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.206874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gr8t\" (UniqueName: \"kubernetes.io/projected/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-kube-api-access-4gr8t\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207006 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.206982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/63a8c1d1-a439-488e-aad2-c0e97e0625ea-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.207122 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207122 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207245 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-crio-socket\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207245 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.207241 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:27:53.207362 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-crio-socket\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207362 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.207362 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.207299 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls podName:9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:53.707285284 +0000 UTC m=+50.590271465 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7phz8" (UID: "9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7") : secret "insights-runtime-extractor-tls" not found Apr 24 21:27:53.207362 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-data-volume\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.207724 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.207703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.218158 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.218128 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gr8t\" (UniqueName: \"kubernetes.io/projected/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-kube-api-access-4gr8t\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:53.308315 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.308261 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.308491 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.308365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/63a8c1d1-a439-488e-aad2-c0e97e0625ea-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.308697 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.308667 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:53.308833 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.308748 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert podName:63a8c1d1-a439-488e-aad2-c0e97e0625ea nodeName:}" failed. No retries permitted until 2026-04-24 21:27:53.80872892 +0000 UTC m=+50.691715105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dv6rs" (UID: "63a8c1d1-a439-488e-aad2-c0e97e0625ea") : secret "networking-console-plugin-cert" not found Apr 24 21:27:53.309159 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.309138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/63a8c1d1-a439-488e-aad2-c0e97e0625ea-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.566828 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.566797 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr"] Apr 24 21:27:53.570079 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:53.570048 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefbf90c_3ad0_4580_95b5_aab6da3d5a4d.slice/crio-ff5c5d0962042602adfb2853de115a45b38570919afe3a9fd353e93642732551 WatchSource:0}: Error finding container ff5c5d0962042602adfb2853de115a45b38570919afe3a9fd353e93642732551: Status 404 returned error can't find the container with id ff5c5d0962042602adfb2853de115a45b38570919afe3a9fd353e93642732551 Apr 24 21:27:53.711504 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.711423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 
21:27:53.711687 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.711596 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:27:53.711687 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.711668 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls podName:9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:54.711648567 +0000 UTC m=+51.594634762 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7phz8" (UID: "9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7") : secret "insights-runtime-extractor-tls" not found Apr 24 21:27:53.812374 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.812337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:53.812658 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.812495 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:53.812658 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:53.812591 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert podName:63a8c1d1-a439-488e-aad2-c0e97e0625ea nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:54.812553766 +0000 UTC m=+51.695539946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dv6rs" (UID: "63a8c1d1-a439-488e-aad2-c0e97e0625ea") : secret "networking-console-plugin-cert" not found Apr 24 21:27:53.887662 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:53.887630 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-672zj_d5617082-ad48-4271-8c23-19c149807eba/node-ca/0.log" Apr 24 21:27:54.026869 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.026783 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" event={"ID":"befbf90c-3ad0-4580-95b5-aab6da3d5a4d","Type":"ContainerStarted","Data":"ff5c5d0962042602adfb2853de115a45b38570919afe3a9fd353e93642732551"} Apr 24 21:27:54.028437 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.028416 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log" Apr 24 21:27:54.028908 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.028888 2577 scope.go:117] "RemoveContainer" containerID="6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595" Apr 24 21:27:54.029138 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:54.029111 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjgmr_openshift-console-operator(07997d80-1c4d-46a5-a441-0ac4b389addb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" podUID="07997d80-1c4d-46a5-a441-0ac4b389addb" Apr 24 
21:27:54.670134 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.670090 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9k5cc"] Apr 24 21:27:54.693559 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.693524 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9k5cc"] Apr 24 21:27:54.693741 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.693669 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.696465 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.696443 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 24 21:27:54.696629 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.696443 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 24 21:27:54.697868 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.697730 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 24 21:27:54.697868 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.697753 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 24 21:27:54.697868 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.697752 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-kpjmw\"" Apr 24 21:27:54.722353 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.722329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/945f05d0-62d0-48d6-8d1a-4639c1211757-signing-key\") pod \"service-ca-865cb79987-9k5cc\" (UID: 
\"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.722537 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.722518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:54.722633 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.722554 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/945f05d0-62d0-48d6-8d1a-4639c1211757-signing-cabundle\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.722692 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:54.722680 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:27:54.722745 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.722714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78pk\" (UniqueName: \"kubernetes.io/projected/945f05d0-62d0-48d6-8d1a-4639c1211757-kube-api-access-m78pk\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.722745 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:54.722738 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls podName:9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:56.722717716 +0000 UTC m=+53.605703896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7phz8" (UID: "9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7") : secret "insights-runtime-extractor-tls" not found Apr 24 21:27:54.823763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.823730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/945f05d0-62d0-48d6-8d1a-4639c1211757-signing-cabundle\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.823939 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.823782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m78pk\" (UniqueName: \"kubernetes.io/projected/945f05d0-62d0-48d6-8d1a-4639c1211757-kube-api-access-m78pk\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.823939 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.823827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/945f05d0-62d0-48d6-8d1a-4639c1211757-signing-key\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.824054 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.823981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:54.824138 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:54.824114 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:54.824219 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:54.824196 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert podName:63a8c1d1-a439-488e-aad2-c0e97e0625ea nodeName:}" failed. No retries permitted until 2026-04-24 21:27:56.8241792 +0000 UTC m=+53.707165389 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dv6rs" (UID: "63a8c1d1-a439-488e-aad2-c0e97e0625ea") : secret "networking-console-plugin-cert" not found Apr 24 21:27:54.824543 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.824515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/945f05d0-62d0-48d6-8d1a-4639c1211757-signing-cabundle\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.826639 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:54.826618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/945f05d0-62d0-48d6-8d1a-4639c1211757-signing-key\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:54.834152 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:27:54.834133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78pk\" (UniqueName: \"kubernetes.io/projected/945f05d0-62d0-48d6-8d1a-4639c1211757-kube-api-access-m78pk\") pod \"service-ca-865cb79987-9k5cc\" (UID: \"945f05d0-62d0-48d6-8d1a-4639c1211757\") " pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:55.004746 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:55.004665 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-9k5cc" Apr 24 21:27:55.381302 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:55.381267 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-9k5cc"] Apr 24 21:27:55.414497 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:27:55.414466 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod945f05d0_62d0_48d6_8d1a_4639c1211757.slice/crio-d175f9db4b0e151ac13d77f1e5766e0775cefbd4a2c55272ce8f8997229af3de WatchSource:0}: Error finding container d175f9db4b0e151ac13d77f1e5766e0775cefbd4a2c55272ce8f8997229af3de: Status 404 returned error can't find the container with id d175f9db4b0e151ac13d77f1e5766e0775cefbd4a2c55272ce8f8997229af3de Apr 24 21:27:56.038667 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.038620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" event={"ID":"3273bf0f-5e81-4792-af04-76455d6aa3a1","Type":"ContainerStarted","Data":"d640b41c9a62eda82c91c6e26077046e1998c0441d66f25b8d23c47049034006"} Apr 24 21:27:56.038667 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.038669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" 
event={"ID":"3273bf0f-5e81-4792-af04-76455d6aa3a1","Type":"ContainerStarted","Data":"120e4d398d093ddcab24e1ef254c590e4e9e4d9ee77b2de5d154026125c9451a"} Apr 24 21:27:56.040286 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.040260 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-49ln2" event={"ID":"e4eb2c6b-a416-422e-a95a-e7759eca39e8","Type":"ContainerStarted","Data":"3e9a29dced74928238b76e086bc45f28b879a8790f6a6c19bc3d58fc82f189f2"} Apr 24 21:27:56.041726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.041701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9k5cc" event={"ID":"945f05d0-62d0-48d6-8d1a-4639c1211757","Type":"ContainerStarted","Data":"06cd22c6f9a81354e5fc960c257da646e3686f06d7c9e0b3efb59dea43aaec74"} Apr 24 21:27:56.041835 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.041734 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-9k5cc" event={"ID":"945f05d0-62d0-48d6-8d1a-4639c1211757","Type":"ContainerStarted","Data":"d175f9db4b0e151ac13d77f1e5766e0775cefbd4a2c55272ce8f8997229af3de"} Apr 24 21:27:56.065299 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.065244 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" podStartSLOduration=7.627794622 podStartE2EDuration="25.065226478s" podCreationTimestamp="2026-04-24 21:27:31 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.817513528 +0000 UTC m=+34.700499712" lastFinishedPulling="2026-04-24 21:27:55.254945375 +0000 UTC m=+52.137931568" observedRunningTime="2026-04-24 21:27:56.064281736 +0000 UTC m=+52.947267938" watchObservedRunningTime="2026-04-24 21:27:56.065226478 +0000 UTC m=+52.948212682" Apr 24 21:27:56.098943 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.098896 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-service-ca/service-ca-865cb79987-9k5cc" podStartSLOduration=2.098880687 podStartE2EDuration="2.098880687s" podCreationTimestamp="2026-04-24 21:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:56.097645043 +0000 UTC m=+52.980631247" watchObservedRunningTime="2026-04-24 21:27:56.098880687 +0000 UTC m=+52.981866950" Apr 24 21:27:56.744080 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:56.743646 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:27:56.744080 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:56.743729 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls podName:9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.743708802 +0000 UTC m=+57.626694991 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7phz8" (UID: "9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7") : secret "insights-runtime-extractor-tls" not found Apr 24 21:27:56.744629 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.743493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:27:56.845897 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.845861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:27:56.846108 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:56.846026 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:56.846108 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:56.846106 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert podName:63a8c1d1-a439-488e-aad2-c0e97e0625ea nodeName:}" failed. No retries permitted until 2026-04-24 21:28:00.846086048 +0000 UTC m=+57.729072235 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dv6rs" (UID: "63a8c1d1-a439-488e-aad2-c0e97e0625ea") : secret "networking-console-plugin-cert" not found Apr 24 21:27:56.933290 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.933256 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:27:56.933745 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:56.933729 2577 scope.go:117] "RemoveContainer" containerID="6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595" Apr 24 21:27:56.933966 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:27:56.933946 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjgmr_openshift-console-operator(07997d80-1c4d-46a5-a441-0ac4b389addb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" podUID="07997d80-1c4d-46a5-a441-0ac4b389addb" Apr 24 21:27:57.046863 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:57.046712 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" event={"ID":"befbf90c-3ad0-4580-95b5-aab6da3d5a4d","Type":"ContainerStarted","Data":"04441c795688330a8147b4ae968dfe4736551117aaba17e627d8a62e2bfa501f"} Apr 24 21:27:57.046863 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:57.046765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" event={"ID":"befbf90c-3ad0-4580-95b5-aab6da3d5a4d","Type":"ContainerStarted","Data":"9a718ada5aa44d511208aa6d92930136d4a42d7c5a886b360c6f3dba85c3cbdf"} Apr 24 21:27:57.066108 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:57.066045 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lslhr" podStartSLOduration=2.504845736 podStartE2EDuration="5.066027423s" podCreationTimestamp="2026-04-24 21:27:52 +0000 UTC" firstStartedPulling="2026-04-24 21:27:53.572490214 +0000 UTC m=+50.455476394" lastFinishedPulling="2026-04-24 21:27:56.133671901 +0000 UTC m=+53.016658081" observedRunningTime="2026-04-24 21:27:57.06441265 +0000 UTC m=+53.947398853" watchObservedRunningTime="2026-04-24 21:27:57.066027423 +0000 UTC m=+53.949013626" Apr 24 21:27:57.066674 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:27:57.066643 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-49ln2" podStartSLOduration=29.416927592 podStartE2EDuration="34.066632595s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:50.783004236 +0000 UTC m=+47.665990428" lastFinishedPulling="2026-04-24 21:27:55.432709251 +0000 UTC m=+52.315695431" observedRunningTime="2026-04-24 21:27:56.11369043 +0000 UTC m=+52.996676633" watchObservedRunningTime="2026-04-24 21:27:57.066632595 +0000 UTC m=+53.949618798" Apr 24 21:28:00.787502 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:00.787460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:28:00.787898 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:28:00.787626 2577 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 21:28:00.787898 ip-10-0-136-65 
kubenswrapper[2577]: E0424 21:28:00.787696 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls podName:9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.787677547 +0000 UTC m=+65.670663729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7phz8" (UID: "9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7") : secret "insights-runtime-extractor-tls" not found Apr 24 21:28:00.888890 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:00.888847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:28:00.889049 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:28:00.888998 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:00.889091 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:28:00.889073 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert podName:63a8c1d1-a439-488e-aad2-c0e97e0625ea nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.889055866 +0000 UTC m=+65.772042046 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-dv6rs" (UID: "63a8c1d1-a439-488e-aad2-c0e97e0625ea") : secret "networking-console-plugin-cert" not found Apr 24 21:28:00.984152 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:00.984104 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:28:00.984658 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:00.984642 2577 scope.go:117] "RemoveContainer" containerID="6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595" Apr 24 21:28:00.984900 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:28:00.984879 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjgmr_openshift-console-operator(07997d80-1c4d-46a5-a441-0ac4b389addb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" podUID="07997d80-1c4d-46a5-a441-0ac4b389addb" Apr 24 21:28:01.852352 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:01.852324 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvpcz" Apr 24 21:28:07.016236 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:07.016195 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" podUID="3273bf0f-5e81-4792-af04-76455d6aa3a1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:28:08.459253 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.459217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:08.459253 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.459257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:28:08.459777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.459399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:28:08.459777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.459446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:28:08.459777 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.459476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " 
pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:08.460190 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.460166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-service-ca-bundle\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:08.461884 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.461853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"image-registry-76b48cbfb7-6htxn\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") " pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:28:08.461983 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.461900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b5880055-3142-40fe-9ab5-cee4fa3d85e7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-5r28j\" (UID: \"b5880055-3142-40fe-9ab5-cee4fa3d85e7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:28:08.462150 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.462126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fc4caba6-6af1-42ee-94d4-c7492907eb5a-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-928mk\" (UID: \"fc4caba6-6af1-42ee-94d4-c7492907eb5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:28:08.462187 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.462133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/4eb7ba39-8089-4d1b-a005-3ecfaf77739a-metrics-certs\") pod \"router-default-54fb9fcb78-xvm5f\" (UID: \"4eb7ba39-8089-4d1b-a005-3ecfaf77739a\") " pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:08.589129 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.589096 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ttx96\"" Apr 24 21:28:08.597608 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.597587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:28:08.602113 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.602093 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-b25rk\"" Apr 24 21:28:08.609462 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.609441 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" Apr 24 21:28:08.650943 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.650910 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-p8rqx\"" Apr 24 21:28:08.658392 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.658357 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" Apr 24 21:28:08.662099 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.661542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:28:08.662099 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.661617 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:28:08.668686 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.668426 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vn6v6\"" Apr 24 21:28:08.669817 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.669764 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f06595a3-8f61-4c8e-93ae-03b5b752052a-cert\") pod \"ingress-canary-5tnsp\" (UID: \"f06595a3-8f61-4c8e-93ae-03b5b752052a\") " pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:28:08.670553 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.670535 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58babfb5-74e8-4175-b89d-bec0e2b2ea46-metrics-tls\") pod \"dns-default-l9zd5\" (UID: \"58babfb5-74e8-4175-b89d-bec0e2b2ea46\") " pod="openshift-dns/dns-default-l9zd5" Apr 24 21:28:08.676820 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.676663 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:08.746344 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.746257 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76b48cbfb7-6htxn"] Apr 24 21:28:08.754405 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:08.752821 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2ccbb7_d5b5_4177_aab6_7905980c06f8.slice/crio-ba85af33803178db1c25aed868ed1a99ed30795da76ef944a34ba6393d80eb14 WatchSource:0}: Error finding container ba85af33803178db1c25aed868ed1a99ed30795da76ef944a34ba6393d80eb14: Status 404 returned error can't find the container with id ba85af33803178db1c25aed868ed1a99ed30795da76ef944a34ba6393d80eb14 Apr 24 21:28:08.766400 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.766374 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk"] Apr 24 21:28:08.817227 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.817201 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j"] Apr 24 21:28:08.828942 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:08.828913 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5880055_3142_40fe_9ab5_cee4fa3d85e7.slice/crio-30ee588ce3edf5590f7fd8720d5622bca20078aac20a27180a05437cbb6c3771 WatchSource:0}: Error finding container 30ee588ce3edf5590f7fd8720d5622bca20078aac20a27180a05437cbb6c3771: Status 404 returned error can't find the container with id 30ee588ce3edf5590f7fd8720d5622bca20078aac20a27180a05437cbb6c3771 Apr 24 21:28:08.841943 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.841905 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress/router-default-54fb9fcb78-xvm5f"] Apr 24 21:28:08.845434 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:08.845410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb7ba39_8089_4d1b_a005_3ecfaf77739a.slice/crio-1a7481e41580ecd57d31d5d8179f1274bdc335d734812e92f3651bebde46927e WatchSource:0}: Error finding container 1a7481e41580ecd57d31d5d8179f1274bdc335d734812e92f3651bebde46927e: Status 404 returned error can't find the container with id 1a7481e41580ecd57d31d5d8179f1274bdc335d734812e92f3651bebde46927e Apr 24 21:28:08.863686 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.863657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:28:08.866115 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.866093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7phz8\" (UID: \"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7\") " pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:28:08.891564 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.891544 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j9cg5\"" Apr 24 21:28:08.898011 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.897989 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m29p7\"" Apr 24 21:28:08.899913 ip-10-0-136-65 kubenswrapper[2577]: I0424 
21:28:08.899896 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5tnsp" Apr 24 21:28:08.906384 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.906361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l9zd5" Apr 24 21:28:08.921820 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.921759 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hhbdt\"" Apr 24 21:28:08.929660 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.929637 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7phz8" Apr 24 21:28:08.971055 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.969327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:28:08.978525 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:08.978364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/63a8c1d1-a439-488e-aad2-c0e97e0625ea-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-dv6rs\" (UID: \"63a8c1d1-a439-488e-aad2-c0e97e0625ea\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:28:09.042232 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.041991 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7fxnn\"" Apr 24 21:28:09.044756 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:28:09.044729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" Apr 24 21:28:09.087083 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.087004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" event={"ID":"fc4caba6-6af1-42ee-94d4-c7492907eb5a","Type":"ContainerStarted","Data":"8612d713f23c6d15581feafbfc53d19ace9f6af4ff44415d6968316e5a68c278"} Apr 24 21:28:09.090226 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.089533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" event={"ID":"3d2ccbb7-d5b5-4177-aab6-7905980c06f8","Type":"ContainerStarted","Data":"749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d"} Apr 24 21:28:09.090226 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.089589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" event={"ID":"3d2ccbb7-d5b5-4177-aab6-7905980c06f8","Type":"ContainerStarted","Data":"ba85af33803178db1c25aed868ed1a99ed30795da76ef944a34ba6393d80eb14"} Apr 24 21:28:09.090226 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.089717 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5tnsp"] Apr 24 21:28:09.090226 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.089971 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:28:09.092868 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.092134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" 
event={"ID":"4eb7ba39-8089-4d1b-a005-3ecfaf77739a","Type":"ContainerStarted","Data":"136c249053fa9492253e5b09ea6d75ec67552898a94c624c9d7f89de7af2c850"} Apr 24 21:28:09.092868 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.092168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" event={"ID":"4eb7ba39-8089-4d1b-a005-3ecfaf77739a","Type":"ContainerStarted","Data":"1a7481e41580ecd57d31d5d8179f1274bdc335d734812e92f3651bebde46927e"} Apr 24 21:28:09.095763 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.095714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" event={"ID":"b5880055-3142-40fe-9ab5-cee4fa3d85e7","Type":"ContainerStarted","Data":"30ee588ce3edf5590f7fd8720d5622bca20078aac20a27180a05437cbb6c3771"} Apr 24 21:28:09.120609 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.114985 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" podStartSLOduration=65.11491851 podStartE2EDuration="1m5.11491851s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:09.11375272 +0000 UTC m=+65.996738937" watchObservedRunningTime="2026-04-24 21:28:09.11491851 +0000 UTC m=+65.997904715" Apr 24 21:28:09.120977 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.120930 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l9zd5"] Apr 24 21:28:09.126679 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:09.126642 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58babfb5_74e8_4175_b89d_bec0e2b2ea46.slice/crio-2a054e7ea65289776fd19847800ed358bece8373431a5d67d041ebf0e96d5bbf WatchSource:0}: Error finding container 
2a054e7ea65289776fd19847800ed358bece8373431a5d67d041ebf0e96d5bbf: Status 404 returned error can't find the container with id 2a054e7ea65289776fd19847800ed358bece8373431a5d67d041ebf0e96d5bbf Apr 24 21:28:09.138836 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.138782 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" podStartSLOduration=44.138765502 podStartE2EDuration="44.138765502s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:09.137813659 +0000 UTC m=+66.020799862" watchObservedRunningTime="2026-04-24 21:28:09.138765502 +0000 UTC m=+66.021751706" Apr 24 21:28:09.141100 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.141082 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7phz8"] Apr 24 21:28:09.143849 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:09.143825 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf2c1f0_9ce8_4900_bae7_fdd5bfcb55e7.slice/crio-149cb78228605c0cd0b492cc568f2aee9daf5a60638c4660489506d0db6b7b9c WatchSource:0}: Error finding container 149cb78228605c0cd0b492cc568f2aee9daf5a60638c4660489506d0db6b7b9c: Status 404 returned error can't find the container with id 149cb78228605c0cd0b492cc568f2aee9daf5a60638c4660489506d0db6b7b9c Apr 24 21:28:09.199881 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.199806 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs"] Apr 24 21:28:09.202256 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:09.202222 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63a8c1d1_a439_488e_aad2_c0e97e0625ea.slice/crio-4360c64a07df6a27a99ba65178e8e5124c5009d06a8b00c93898f7e0b68e8870 WatchSource:0}: Error finding container 4360c64a07df6a27a99ba65178e8e5124c5009d06a8b00c93898f7e0b68e8870: Status 404 returned error can't find the container with id 4360c64a07df6a27a99ba65178e8e5124c5009d06a8b00c93898f7e0b68e8870 Apr 24 21:28:09.473088 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.473009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:28:09.475926 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.475899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c88b1b60-c919-439f-810d-ad2b2ecf4811-metrics-certs\") pod \"network-metrics-daemon-r5pbt\" (UID: \"c88b1b60-c919-439f-810d-ad2b2ecf4811\") " pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:28:09.677920 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.677863 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:09.681180 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.680958 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:09.763969 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.763715 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tshq6\"" Apr 24 21:28:09.771644 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.771378 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r5pbt" Apr 24 21:28:09.948088 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:09.948025 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r5pbt"] Apr 24 21:28:10.115614 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.115546 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7phz8" event={"ID":"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7","Type":"ContainerStarted","Data":"a6b8e30c0443858aae28151ea868730a7f71dfded88eeb536a9644d431c30a15"} Apr 24 21:28:10.115966 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.115918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7phz8" event={"ID":"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7","Type":"ContainerStarted","Data":"149cb78228605c0cd0b492cc568f2aee9daf5a60638c4660489506d0db6b7b9c"} Apr 24 21:28:10.118563 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.118533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" event={"ID":"63a8c1d1-a439-488e-aad2-c0e97e0625ea","Type":"ContainerStarted","Data":"4360c64a07df6a27a99ba65178e8e5124c5009d06a8b00c93898f7e0b68e8870"} Apr 24 21:28:10.120947 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.120919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5tnsp" event={"ID":"f06595a3-8f61-4c8e-93ae-03b5b752052a","Type":"ContainerStarted","Data":"229f3190491968211d95e90adf59357261d1e4c2535166e453f5a68bc607a11a"} Apr 24 21:28:10.123566 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.123542 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l9zd5" event={"ID":"58babfb5-74e8-4175-b89d-bec0e2b2ea46","Type":"ContainerStarted","Data":"2a054e7ea65289776fd19847800ed358bece8373431a5d67d041ebf0e96d5bbf"} Apr 24 
21:28:10.124128 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.124111 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:10.125734 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:10.125714 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-54fb9fcb78-xvm5f" Apr 24 21:28:11.127269 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:11.127234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r5pbt" event={"ID":"c88b1b60-c919-439f-810d-ad2b2ecf4811","Type":"ContainerStarted","Data":"0f19852c6d092ab6357c58100b42afa788474ca98d21d15e844088095f4efd30"} Apr 24 21:28:15.141479 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.141439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" event={"ID":"63a8c1d1-a439-488e-aad2-c0e97e0625ea","Type":"ContainerStarted","Data":"0fd047e3c6905ff58e41bd6fec31dfe215d34a1f2df13ea4e488584eade9de90"} Apr 24 21:28:15.143022 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.142991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5tnsp" event={"ID":"f06595a3-8f61-4c8e-93ae-03b5b752052a","Type":"ContainerStarted","Data":"294c9b73cfa2db7c1f85c1dd05317bdf926b15d47ba112f829f52f9a36ea8a75"} Apr 24 21:28:15.145057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.144994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l9zd5" event={"ID":"58babfb5-74e8-4175-b89d-bec0e2b2ea46","Type":"ContainerStarted","Data":"53bbb13cf870ea33e73d20ed63f673485b4a097311f66b5b60b1656b5b8e546c"} Apr 24 21:28:15.145057 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.145049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l9zd5" 
event={"ID":"58babfb5-74e8-4175-b89d-bec0e2b2ea46","Type":"ContainerStarted","Data":"3af54e6125d25e853ce3a7c51c5c2296fd41b36bd3e4fb69439f685ac0595c20"} Apr 24 21:28:15.145252 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.145091 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-l9zd5" Apr 24 21:28:15.146699 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.146667 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" event={"ID":"b5880055-3142-40fe-9ab5-cee4fa3d85e7","Type":"ContainerStarted","Data":"a5c6c1d7c76e3a69eed1f2dc1ecfe79e0db959b952e1dfdd00d5bd5f800f157b"} Apr 24 21:28:15.148360 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.148335 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r5pbt" event={"ID":"c88b1b60-c919-439f-810d-ad2b2ecf4811","Type":"ContainerStarted","Data":"7252a2b0b20c1efa834005bcd9ad17ccf17bbdb37319185c4f9913fb6fe68cd6"} Apr 24 21:28:15.148483 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.148375 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r5pbt" event={"ID":"c88b1b60-c919-439f-810d-ad2b2ecf4811","Type":"ContainerStarted","Data":"65691e67919f69ace586eb675a39008a75afffe0ded7382149debafcc68a733a"} Apr 24 21:28:15.150104 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.150072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" event={"ID":"fc4caba6-6af1-42ee-94d4-c7492907eb5a","Type":"ContainerStarted","Data":"f3b6e8695a5e89c8852c1a938abda447d0f212dcfc005bb2f09d57d3b4e91985"} Apr 24 21:28:15.150104 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.150103 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" 
event={"ID":"fc4caba6-6af1-42ee-94d4-c7492907eb5a","Type":"ContainerStarted","Data":"c2492b414048f3d03e05f83755455615a2ba2bb9ef07b7b1a2dd9e41d12e1f8a"} Apr 24 21:28:15.151935 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.151911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7phz8" event={"ID":"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7","Type":"ContainerStarted","Data":"f2c80ed190e1bd409ce29972295d5b61833d1ed0fb0d49b2861bfbca7e430260"} Apr 24 21:28:15.160862 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.160808 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-dv6rs" podStartSLOduration=17.2202884 podStartE2EDuration="22.160794668s" podCreationTimestamp="2026-04-24 21:27:53 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.204881015 +0000 UTC m=+66.087867204" lastFinishedPulling="2026-04-24 21:28:14.145387282 +0000 UTC m=+71.028373472" observedRunningTime="2026-04-24 21:28:15.159013455 +0000 UTC m=+72.041999658" watchObservedRunningTime="2026-04-24 21:28:15.160794668 +0000 UTC m=+72.043780870" Apr 24 21:28:15.177632 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.177583 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-928mk" podStartSLOduration=44.867553862 podStartE2EDuration="50.177555025s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:08.835386846 +0000 UTC m=+65.718373031" lastFinishedPulling="2026-04-24 21:28:14.145388013 +0000 UTC m=+71.028374194" observedRunningTime="2026-04-24 21:28:15.176506804 +0000 UTC m=+72.059493006" watchObservedRunningTime="2026-04-24 21:28:15.177555025 +0000 UTC m=+72.060541230" Apr 24 21:28:15.197421 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.197382 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-5tnsp" podStartSLOduration=34.149240067 podStartE2EDuration="39.197370105s" podCreationTimestamp="2026-04-24 21:27:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.097742786 +0000 UTC m=+65.980728971" lastFinishedPulling="2026-04-24 21:28:14.145872828 +0000 UTC m=+71.028859009" observedRunningTime="2026-04-24 21:28:15.196594506 +0000 UTC m=+72.079580712" watchObservedRunningTime="2026-04-24 21:28:15.197370105 +0000 UTC m=+72.080356308" Apr 24 21:28:15.224185 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.224138 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l9zd5" podStartSLOduration=34.208343053 podStartE2EDuration="39.224127042s" podCreationTimestamp="2026-04-24 21:27:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.130154858 +0000 UTC m=+66.013141037" lastFinishedPulling="2026-04-24 21:28:14.145938842 +0000 UTC m=+71.028925026" observedRunningTime="2026-04-24 21:28:15.223435659 +0000 UTC m=+72.106421855" watchObservedRunningTime="2026-04-24 21:28:15.224127042 +0000 UTC m=+72.107113244" Apr 24 21:28:15.242778 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.242718 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-5r28j" podStartSLOduration=44.928055607 podStartE2EDuration="50.242704057s" podCreationTimestamp="2026-04-24 21:27:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:08.830849633 +0000 UTC m=+65.713835815" lastFinishedPulling="2026-04-24 21:28:14.145498078 +0000 UTC m=+71.028484265" observedRunningTime="2026-04-24 21:28:15.241945198 +0000 UTC m=+72.124931401" watchObservedRunningTime="2026-04-24 21:28:15.242704057 +0000 UTC m=+72.125690260" Apr 24 21:28:15.266032 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.265986 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-r5pbt" 
podStartSLOduration=67.832264981 podStartE2EDuration="1m12.26597117s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:28:10.111423451 +0000 UTC m=+66.994409639" lastFinishedPulling="2026-04-24 21:28:14.545129644 +0000 UTC m=+71.428115828" observedRunningTime="2026-04-24 21:28:15.265914615 +0000 UTC m=+72.148900810" watchObservedRunningTime="2026-04-24 21:28:15.26597117 +0000 UTC m=+72.148957373" Apr 24 21:28:15.703634 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:15.703605 2577 scope.go:117] "RemoveContainer" containerID="6fabb25e684378baa65e513d7ab20ecb00bef9a33a7310c552f4c6188505a595" Apr 24 21:28:16.156942 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.156918 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log" Apr 24 21:28:16.157355 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.157010 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" event={"ID":"07997d80-1c4d-46a5-a441-0ac4b389addb","Type":"ContainerStarted","Data":"5cc72d8cd09b0b92e3b453af6a97b377e88df3e916e843c0665ecd2a50d7039e"} Apr 24 21:28:16.335565 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.335531 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx"] Apr 24 21:28:16.355183 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.355148 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx"] Apr 24 21:28:16.355387 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.355307 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:16.358405 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.358372 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:28:16.358550 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.358405 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-kq6gw\"" Apr 24 21:28:16.536564 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.536539 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/556119d8-8806-465d-a584-a78643b77ef6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xhsfx\" (UID: \"556119d8-8806-465d-a584-a78643b77ef6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:16.637381 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.637343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/556119d8-8806-465d-a584-a78643b77ef6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xhsfx\" (UID: \"556119d8-8806-465d-a584-a78643b77ef6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:16.639784 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.639764 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/556119d8-8806-465d-a584-a78643b77ef6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xhsfx\" (UID: \"556119d8-8806-465d-a584-a78643b77ef6\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:16.667270 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.667243 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:16.804662 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.804471 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx"] Apr 24 21:28:16.807150 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:16.807111 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod556119d8_8806_465d_a584_a78643b77ef6.slice/crio-0c6009c129fe476abf2ec7a2debdf05d026518e49e6ed1edf7adfdbf08c69390 WatchSource:0}: Error finding container 0c6009c129fe476abf2ec7a2debdf05d026518e49e6ed1edf7adfdbf08c69390: Status 404 returned error can't find the container with id 0c6009c129fe476abf2ec7a2debdf05d026518e49e6ed1edf7adfdbf08c69390 Apr 24 21:28:16.915324 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.915236 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-m4mnm"] Apr 24 21:28:16.952378 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.952348 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m4mnm"] Apr 24 21:28:16.952561 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.952480 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m4mnm" Apr 24 21:28:16.955213 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.955189 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:28:16.955337 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.955267 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-l469x\"" Apr 24 21:28:16.955500 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:16.955475 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:28:17.016338 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.016297 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" podUID="3273bf0f-5e81-4792-af04-76455d6aa3a1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:28:17.040111 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.040079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txg6g\" (UniqueName: \"kubernetes.io/projected/ccbbba6b-8306-4a87-b893-3355fbae99d6-kube-api-access-txg6g\") pod \"downloads-6bcc868b7-m4mnm\" (UID: \"ccbbba6b-8306-4a87-b893-3355fbae99d6\") " pod="openshift-console/downloads-6bcc868b7-m4mnm" Apr 24 21:28:17.141183 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.141151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txg6g\" (UniqueName: \"kubernetes.io/projected/ccbbba6b-8306-4a87-b893-3355fbae99d6-kube-api-access-txg6g\") pod \"downloads-6bcc868b7-m4mnm\" (UID: \"ccbbba6b-8306-4a87-b893-3355fbae99d6\") " pod="openshift-console/downloads-6bcc868b7-m4mnm" Apr 24 21:28:17.160806 ip-10-0-136-65 kubenswrapper[2577]: 
I0424 21:28:17.160760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" event={"ID":"556119d8-8806-465d-a584-a78643b77ef6","Type":"ContainerStarted","Data":"0c6009c129fe476abf2ec7a2debdf05d026518e49e6ed1edf7adfdbf08c69390"} Apr 24 21:28:17.162726 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.162693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7phz8" event={"ID":"9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7","Type":"ContainerStarted","Data":"709e3f59c74bcc82185f2f24af7bc11ebfda3fe3f466f912d0409117a8e7125d"} Apr 24 21:28:17.162848 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.162806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txg6g\" (UniqueName: \"kubernetes.io/projected/ccbbba6b-8306-4a87-b893-3355fbae99d6-kube-api-access-txg6g\") pod \"downloads-6bcc868b7-m4mnm\" (UID: \"ccbbba6b-8306-4a87-b893-3355fbae99d6\") " pod="openshift-console/downloads-6bcc868b7-m4mnm" Apr 24 21:28:17.264595 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.264489 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m4mnm" Apr 24 21:28:17.417839 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.417781 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7phz8" podStartSLOduration=18.186659803 podStartE2EDuration="25.417757715s" podCreationTimestamp="2026-04-24 21:27:52 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.216739203 +0000 UTC m=+66.099725387" lastFinishedPulling="2026-04-24 21:28:16.447837116 +0000 UTC m=+73.330823299" observedRunningTime="2026-04-24 21:28:17.203850508 +0000 UTC m=+74.086836742" watchObservedRunningTime="2026-04-24 21:28:17.417757715 +0000 UTC m=+74.300743922" Apr 24 21:28:17.418911 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:17.418862 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m4mnm"] Apr 24 21:28:17.421638 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:17.421609 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccbbba6b_8306_4a87_b893_3355fbae99d6.slice/crio-807c76756d6cd74d82fd14ed4b61bf46248e84cf800d64a4fa100d9d6e8bc579 WatchSource:0}: Error finding container 807c76756d6cd74d82fd14ed4b61bf46248e84cf800d64a4fa100d9d6e8bc579: Status 404 returned error can't find the container with id 807c76756d6cd74d82fd14ed4b61bf46248e84cf800d64a4fa100d9d6e8bc579 Apr 24 21:28:18.172828 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:18.172785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m4mnm" event={"ID":"ccbbba6b-8306-4a87-b893-3355fbae99d6","Type":"ContainerStarted","Data":"807c76756d6cd74d82fd14ed4b61bf46248e84cf800d64a4fa100d9d6e8bc579"} Apr 24 21:28:19.177762 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:19.177716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" event={"ID":"556119d8-8806-465d-a584-a78643b77ef6","Type":"ContainerStarted","Data":"ec1b0eafe987c7cf94f54194b59a88aef84423f529e4045ddcbcf144ac6159ba"} Apr 24 21:28:19.178280 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:19.177996 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:19.184251 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:19.184226 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" Apr 24 21:28:19.197730 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:19.197686 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xhsfx" podStartSLOduration=1.769310662 podStartE2EDuration="3.197673535s" podCreationTimestamp="2026-04-24 21:28:16 +0000 UTC" firstStartedPulling="2026-04-24 21:28:16.809179449 +0000 UTC m=+73.692165639" lastFinishedPulling="2026-04-24 21:28:18.237542327 +0000 UTC m=+75.120528512" observedRunningTime="2026-04-24 21:28:19.195603546 +0000 UTC m=+76.078589750" watchObservedRunningTime="2026-04-24 21:28:19.197673535 +0000 UTC m=+76.080659738" Apr 24 21:28:23.026081 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:23.025945 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-g2stj" Apr 24 21:28:25.160990 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.160957 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l9zd5" Apr 24 21:28:25.442004 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.441703 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7kw28"] Apr 24 21:28:25.447041 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.447012 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.450356 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.450331 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5pldf\"" Apr 24 21:28:25.450609 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.450340 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:28:25.450764 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.450412 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:25.450824 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.450418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:28:25.456001 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.455981 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:28:25.509114 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509272 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509131 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-accelerators-collector-config\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509272 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-tls\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509272 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-sys\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509272 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-textfile\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509272 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-wtmp\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509516 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509276 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4b1473-2725-47e1-827e-1afaa4460517-metrics-client-ca\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509516 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-root\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.509516 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.509378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrh5k\" (UniqueName: \"kubernetes.io/projected/5c4b1473-2725-47e1-827e-1afaa4460517-kube-api-access-jrh5k\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610096 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-tls\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-sys\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610283 ip-10-0-136-65 
kubenswrapper[2577]: I0424 21:28:25.610144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-textfile\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-wtmp\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4b1473-2725-47e1-827e-1afaa4460517-metrics-client-ca\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610283 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-root\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrh5k\" (UniqueName: \"kubernetes.io/projected/5c4b1473-2725-47e1-827e-1afaa4460517-kube-api-access-jrh5k\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 
21:28:25.610527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.610527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-accelerators-collector-config\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.611345 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-wtmp\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.611345 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-sys\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.611345 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.610982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5c4b1473-2725-47e1-827e-1afaa4460517-root\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " 
pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.611345 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.611028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-accelerators-collector-config\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.611559 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.611451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c4b1473-2725-47e1-827e-1afaa4460517-metrics-client-ca\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.611764 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.611701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-textfile\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.613937 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.613917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.617882 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.617841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/5c4b1473-2725-47e1-827e-1afaa4460517-node-exporter-tls\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.621834 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.621813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrh5k\" (UniqueName: \"kubernetes.io/projected/5c4b1473-2725-47e1-827e-1afaa4460517-kube-api-access-jrh5k\") pod \"node-exporter-7kw28\" (UID: \"5c4b1473-2725-47e1-827e-1afaa4460517\") " pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.763807 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:25.763721 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7kw28" Apr 24 21:28:25.777458 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:28:25.777419 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4b1473_2725_47e1_827e_1afaa4460517.slice/crio-a46a15014c744d8e842d95009495c0d98ec06f5144767c948bfad5af26981674 WatchSource:0}: Error finding container a46a15014c744d8e842d95009495c0d98ec06f5144767c948bfad5af26981674: Status 404 returned error can't find the container with id a46a15014c744d8e842d95009495c0d98ec06f5144767c948bfad5af26981674 Apr 24 21:28:26.158149 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:26.158118 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:28:26.164911 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:26.164886 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjgmr" Apr 24 21:28:26.207616 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:26.207166 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-7kw28" event={"ID":"5c4b1473-2725-47e1-827e-1afaa4460517","Type":"ContainerStarted","Data":"a46a15014c744d8e842d95009495c0d98ec06f5144767c948bfad5af26981674"} Apr 24 21:28:27.016598 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:27.016544 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" podUID="3273bf0f-5e81-4792-af04-76455d6aa3a1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:28:27.016787 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:27.016629 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" Apr 24 21:28:27.017304 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:27.017258 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"d640b41c9a62eda82c91c6e26077046e1998c0441d66f25b8d23c47049034006"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 21:28:27.017432 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:27.017333 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" podUID="3273bf0f-5e81-4792-af04-76455d6aa3a1" containerName="service-proxy" containerID="cri-o://d640b41c9a62eda82c91c6e26077046e1998c0441d66f25b8d23c47049034006" gracePeriod=30 Apr 24 21:28:28.602112 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:28.602014 2577 patch_prober.go:28] interesting pod/image-registry-76b48cbfb7-6htxn container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:28:28.602112 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:28.602073 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" podUID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:30.128206 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:30.128133 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" Apr 24 21:28:34.238463 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.238409 2577 generic.go:358] "Generic (PLEG): container finished" podID="3273bf0f-5e81-4792-af04-76455d6aa3a1" containerID="d640b41c9a62eda82c91c6e26077046e1998c0441d66f25b8d23c47049034006" exitCode=2 Apr 24 21:28:34.238980 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.238490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" event={"ID":"3273bf0f-5e81-4792-af04-76455d6aa3a1","Type":"ContainerDied","Data":"d640b41c9a62eda82c91c6e26077046e1998c0441d66f25b8d23c47049034006"} Apr 24 21:28:34.238980 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.238537 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55b75bb794-j2j5t" event={"ID":"3273bf0f-5e81-4792-af04-76455d6aa3a1","Type":"ContainerStarted","Data":"699c1a5e26c0617d8448c9b5f0e3b5d158925c902cd8992daced2d0bdcfd0fcf"} Apr 24 21:28:34.240235 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.240210 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m4mnm" 
event={"ID":"ccbbba6b-8306-4a87-b893-3355fbae99d6","Type":"ContainerStarted","Data":"a335aeee2f08064307171f87e700a7c8417f734700caba8c706738fc0e7315d9"}
Apr 24 21:28:34.240381 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.240367 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-m4mnm"
Apr 24 21:28:34.242080 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.242047 2577 generic.go:358] "Generic (PLEG): container finished" podID="5c4b1473-2725-47e1-827e-1afaa4460517" containerID="59987eb497a534c2615b6457dfd6c5a9f6880185677ed0e17c3fd6a6762db94a" exitCode=0
Apr 24 21:28:34.242194 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.242158 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kw28" event={"ID":"5c4b1473-2725-47e1-827e-1afaa4460517","Type":"ContainerDied","Data":"59987eb497a534c2615b6457dfd6c5a9f6880185677ed0e17c3fd6a6762db94a"}
Apr 24 21:28:34.257637 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.257613 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-m4mnm"
Apr 24 21:28:34.298636 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:34.298583 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-m4mnm" podStartSLOduration=2.209332153 podStartE2EDuration="18.298549339s" podCreationTimestamp="2026-04-24 21:28:16 +0000 UTC" firstStartedPulling="2026-04-24 21:28:17.423380151 +0000 UTC m=+74.306366331" lastFinishedPulling="2026-04-24 21:28:33.512597324 +0000 UTC m=+90.395583517" observedRunningTime="2026-04-24 21:28:34.297129294 +0000 UTC m=+91.180115495" watchObservedRunningTime="2026-04-24 21:28:34.298549339 +0000 UTC m=+91.181535542"
Apr 24 21:28:35.247791 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:35.247747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kw28" event={"ID":"5c4b1473-2725-47e1-827e-1afaa4460517","Type":"ContainerStarted","Data":"d357436617f548a6be758df6f8ee45a21eba444ec5c7d64daea20fe272d4d09d"}
Apr 24 21:28:35.248242 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:35.247800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kw28" event={"ID":"5c4b1473-2725-47e1-827e-1afaa4460517","Type":"ContainerStarted","Data":"891efe76b493400de1b3d8eda6c40544c14d147b514ef406b85d7feb78dfedb2"}
Apr 24 21:28:35.269670 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:35.269606 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7kw28" podStartSLOduration=2.633915921 podStartE2EDuration="10.269585932s" podCreationTimestamp="2026-04-24 21:28:25 +0000 UTC" firstStartedPulling="2026-04-24 21:28:25.779639601 +0000 UTC m=+82.662625796" lastFinishedPulling="2026-04-24 21:28:33.415309614 +0000 UTC m=+90.298295807" observedRunningTime="2026-04-24 21:28:35.26712576 +0000 UTC m=+92.150111974" watchObservedRunningTime="2026-04-24 21:28:35.269585932 +0000 UTC m=+92.152572129"
Apr 24 21:28:39.683462 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:28:39.683426 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76b48cbfb7-6htxn"]
Apr 24 21:29:01.326596 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:01.326540 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5e1e857-05a9-48ac-ae67-6ad2198bbcf7" containerID="7d2dc396267f2232499706be5f2c2ec05ba3af27581648555d85ee4ea9e73014" exitCode=0
Apr 24 21:29:01.326977 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:01.326616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r269t" event={"ID":"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7","Type":"ContainerDied","Data":"7d2dc396267f2232499706be5f2c2ec05ba3af27581648555d85ee4ea9e73014"}
Apr 24 21:29:01.326977 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:01.326938 2577 scope.go:117] "RemoveContainer" containerID="7d2dc396267f2232499706be5f2c2ec05ba3af27581648555d85ee4ea9e73014"
Apr 24 21:29:02.331455 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:02.331422 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r269t" event={"ID":"e5e1e857-05a9-48ac-ae67-6ad2198bbcf7","Type":"ContainerStarted","Data":"550e64b015a33fa6f8e6d8aa60dfa6ae0003c5babe2c9ded1c19e9118e98b796"}
Apr 24 21:29:04.706314 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:04.706254 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" podUID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" containerName="registry" containerID="cri-o://749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d" gracePeriod=30
Apr 24 21:29:04.949396 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:04.949374 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:29:05.051630 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051534 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-image-registry-private-configuration\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.051630 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051588 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.051857 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051639 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5kt2\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-kube-api-access-m5kt2\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.051857 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051731 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-trusted-ca\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.051857 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051789 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-bound-sa-token\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.051857 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051839 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-ca-trust-extracted\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.052041 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051867 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-installation-pull-secrets\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.052041 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.051913 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-certificates\") pod \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\" (UID: \"3d2ccbb7-d5b5-4177-aab6-7905980c06f8\") "
Apr 24 21:29:05.052254 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.052189 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:05.052530 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.052460 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:29:05.054052 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.054030 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:05.054351 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.054316 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:05.054351 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.054338 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-kube-api-access-m5kt2" (OuterVolumeSpecName: "kube-api-access-m5kt2") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "kube-api-access-m5kt2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:05.054633 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.054608 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:29:05.054693 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.054640 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:29:05.060000 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.059968 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3d2ccbb7-d5b5-4177-aab6-7905980c06f8" (UID: "3d2ccbb7-d5b5-4177-aab6-7905980c06f8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:29:05.152905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152875 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-bound-sa-token\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.152905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152901 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-ca-trust-extracted\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.152905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152912 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-installation-pull-secrets\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.153201 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152922 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-certificates\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.153201 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152933 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-image-registry-private-configuration\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.153201 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152943 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-registry-tls\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.153201 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152952 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5kt2\" (UniqueName: \"kubernetes.io/projected/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-kube-api-access-m5kt2\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.153201 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.152960 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2ccbb7-d5b5-4177-aab6-7905980c06f8-trusted-ca\") on node \"ip-10-0-136-65.ec2.internal\" DevicePath \"\""
Apr 24 21:29:05.346031 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.345996 2577 generic.go:358] "Generic (PLEG): container finished" podID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" containerID="749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d" exitCode=0
Apr 24 21:29:05.346216 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.346057 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn"
Apr 24 21:29:05.346216 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.346092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" event={"ID":"3d2ccbb7-d5b5-4177-aab6-7905980c06f8","Type":"ContainerDied","Data":"749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d"}
Apr 24 21:29:05.346216 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.346140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76b48cbfb7-6htxn" event={"ID":"3d2ccbb7-d5b5-4177-aab6-7905980c06f8","Type":"ContainerDied","Data":"ba85af33803178db1c25aed868ed1a99ed30795da76ef944a34ba6393d80eb14"}
Apr 24 21:29:05.346216 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.346161 2577 scope.go:117] "RemoveContainer" containerID="749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d"
Apr 24 21:29:05.362561 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.362542 2577 scope.go:117] "RemoveContainer" containerID="749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d"
Apr 24 21:29:05.362921 ip-10-0-136-65 kubenswrapper[2577]: E0424 21:29:05.362901 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d\": container with ID starting with 749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d not found: ID does not exist" containerID="749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d"
Apr 24 21:29:05.362962 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.362929 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d"} err="failed to get container status \"749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d\": rpc error: code = NotFound desc = could not find container \"749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d\": container with ID starting with 749df4717157489f332d232efce1afe9e6b4e29c2b8fed393db32822c097168d not found: ID does not exist"
Apr 24 21:29:05.373414 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.373393 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-76b48cbfb7-6htxn"]
Apr 24 21:29:05.379159 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.379140 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-76b48cbfb7-6htxn"]
Apr 24 21:29:05.702404 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:05.702328 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" path="/var/lib/kubelet/pods/3d2ccbb7-d5b5-4177-aab6-7905980c06f8/volumes"
Apr 24 21:29:17.384250 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:17.384214 2577 generic.go:358] "Generic (PLEG): container finished" podID="b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba" containerID="b846c153f66105cb7ad71880d4285e0e9dabf01e79e675e00334d35a7897c704" exitCode=0
Apr 24 21:29:17.384811 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:17.384293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" event={"ID":"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba","Type":"ContainerDied","Data":"b846c153f66105cb7ad71880d4285e0e9dabf01e79e675e00334d35a7897c704"}
Apr 24 21:29:17.384811 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:17.384764 2577 scope.go:117] "RemoveContainer" containerID="b846c153f66105cb7ad71880d4285e0e9dabf01e79e675e00334d35a7897c704"
Apr 24 21:29:17.385757 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:17.385734 2577 generic.go:358] "Generic (PLEG): container finished" podID="392b6d2c-0064-4bb6-b65f-6bef7161bc01" containerID="7373fcc537799d0de9855a3c6acd35cef1f80f6a7d7a2741b162bc97fdb983ad" exitCode=0
Apr 24 21:29:17.385858 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:17.385792 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" event={"ID":"392b6d2c-0064-4bb6-b65f-6bef7161bc01","Type":"ContainerDied","Data":"7373fcc537799d0de9855a3c6acd35cef1f80f6a7d7a2741b162bc97fdb983ad"}
Apr 24 21:29:17.386062 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:17.386046 2577 scope.go:117] "RemoveContainer" containerID="7373fcc537799d0de9855a3c6acd35cef1f80f6a7d7a2741b162bc97fdb983ad"
Apr 24 21:29:18.390558 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:18.390520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cj6t5" event={"ID":"392b6d2c-0064-4bb6-b65f-6bef7161bc01","Type":"ContainerStarted","Data":"352922d5cb428fbfb5353b36325e79761e28cbf5c8fcfc88b1a7cb930a8d1078"}
Apr 24 21:29:18.392141 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:29:18.392116 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nklzr" event={"ID":"b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba","Type":"ContainerStarted","Data":"e0f2648369fbcb2a7042d0a56ffef76ca9acf7571c45adbda37f033252057253"}
Apr 24 21:32:03.584395 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:32:03.584363 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:32:03.585674 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:32:03.585650 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:32:03.593185 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:32:03.593160 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:37:03.606651 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:37:03.606619 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:37:03.607287 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:37:03.607270 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:42:03.627209 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:42:03.627128 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:42:03.628225 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:42:03.628201 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:47:03.652396 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:47:03.648716 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:47:03.652396 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:47:03.649948 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:48:42.308165 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:42.308136 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-49ln2_e4eb2c6b-a416-422e-a95a-e7759eca39e8/global-pull-secret-syncer/0.log"
Apr 24 21:48:42.400054 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:42.400023 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7mlvd_94182b7e-3409-484c-82ea-df615ef6141e/konnectivity-agent/0.log"
Apr 24 21:48:42.508697 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:42.508669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-65.ec2.internal_9fc64b0f94b5e05651be58c19d8e03a7/haproxy/0.log"
Apr 24 21:48:45.768139 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:45.768105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-5r28j_b5880055-3142-40fe-9ab5-cee4fa3d85e7/cluster-monitoring-operator/0.log"
Apr 24 21:48:45.951929 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:45.951897 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7kw28_5c4b1473-2725-47e1-827e-1afaa4460517/node-exporter/0.log"
Apr 24 21:48:45.979138 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:45.979115 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7kw28_5c4b1473-2725-47e1-827e-1afaa4460517/kube-rbac-proxy/0.log"
Apr 24 21:48:46.003996 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:46.003969 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7kw28_5c4b1473-2725-47e1-827e-1afaa4460517/init-textfile/0.log"
Apr 24 21:48:46.558718 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:46.558689 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xhsfx_556119d8-8806-465d-a584-a78643b77ef6/prometheus-operator-admission-webhook/0.log"
Apr 24 21:48:48.073557 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:48.073517 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-dv6rs_63a8c1d1-a439-488e-aad2-c0e97e0625ea/networking-console-plugin/0.log"
Apr 24 21:48:48.560981 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:48.560927 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/1.log"
Apr 24 21:48:48.565566 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:48.565539 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjgmr_07997d80-1c4d-46a5-a441-0ac4b389addb/console-operator/2.log"
Apr 24 21:48:48.999718 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:48.999692 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-m4mnm_ccbbba6b-8306-4a87-b893-3355fbae99d6/download-server/0.log"
Apr 24 21:48:49.448050 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.448008 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"]
Apr 24 21:48:49.448411 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.448356 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" containerName="registry"
Apr 24 21:48:49.448411 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.448368 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" containerName="registry"
Apr 24 21:48:49.448479 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.448433 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d2ccbb7-d5b5-4177-aab6-7905980c06f8" containerName="registry"
Apr 24 21:48:49.451466 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.451450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.454218 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.454190 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ngn69\"/\"kube-root-ca.crt\""
Apr 24 21:48:49.455301 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.455284 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ngn69\"/\"default-dockercfg-b5l6f\""
Apr 24 21:48:49.455364 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.455284 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ngn69\"/\"openshift-service-ca.crt\""
Apr 24 21:48:49.463308 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.463286 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"]
Apr 24 21:48:49.492424 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.492399 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-g6hwh_557745bb-3785-45f5-8eed-774938893b62/volume-data-source-validator/0.log"
Apr 24 21:48:49.497010 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.496979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-sys\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.497123 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.497036 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-podres\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.497123 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.497072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9hwm\" (UniqueName: \"kubernetes.io/projected/697ffd39-7e9c-41de-af4d-50d20e097e74-kube-api-access-q9hwm\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.497197 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.497160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-proc\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.497244 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.497204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-lib-modules\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597688 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-podres\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597688 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9hwm\" (UniqueName: \"kubernetes.io/projected/697ffd39-7e9c-41de-af4d-50d20e097e74-kube-api-access-q9hwm\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-proc\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-lib-modules\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-sys\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-podres\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.597905 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-sys\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.598083 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.597939 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-proc\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.598083 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.598017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/697ffd39-7e9c-41de-af4d-50d20e097e74-lib-modules\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.606703 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.606681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9hwm\" (UniqueName: \"kubernetes.io/projected/697ffd39-7e9c-41de-af4d-50d20e097e74-kube-api-access-q9hwm\") pod \"perf-node-gather-daemonset-krmhs\" (UID: \"697ffd39-7e9c-41de-af4d-50d20e097e74\") " pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.761180 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.761092 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:49.882329 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.882293 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"]
Apr 24 21:48:49.885282 ip-10-0-136-65 kubenswrapper[2577]: W0424 21:48:49.885252 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod697ffd39_7e9c_41de_af4d_50d20e097e74.slice/crio-9a7ab2694b21ba9154db13984dbeaf8bda55dcc9d6bac0bd2e79a6ce748e7dc2 WatchSource:0}: Error finding container 9a7ab2694b21ba9154db13984dbeaf8bda55dcc9d6bac0bd2e79a6ce748e7dc2: Status 404 returned error can't find the container with id 9a7ab2694b21ba9154db13984dbeaf8bda55dcc9d6bac0bd2e79a6ce748e7dc2
Apr 24 21:48:49.886822 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:49.886806 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:48:50.249409 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.249383 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l9zd5_58babfb5-74e8-4175-b89d-bec0e2b2ea46/dns/0.log"
Apr 24 21:48:50.271513 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.271488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l9zd5_58babfb5-74e8-4175-b89d-bec0e2b2ea46/kube-rbac-proxy/0.log"
Apr 24 21:48:50.368753 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.368721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s74k7_23aa9667-569b-4627-bc38-54b145825a25/dns-node-resolver/0.log"
Apr 24 21:48:50.776359 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.776315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs" event={"ID":"697ffd39-7e9c-41de-af4d-50d20e097e74","Type":"ContainerStarted","Data":"1e35663f80ee3b6093cfd5c26c4df7149b26a591d471416d0c3cfb2d02b9fd59"}
Apr 24 21:48:50.776359 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.776364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs" event={"ID":"697ffd39-7e9c-41de-af4d-50d20e097e74","Type":"ContainerStarted","Data":"9a7ab2694b21ba9154db13984dbeaf8bda55dcc9d6bac0bd2e79a6ce748e7dc2"}
Apr 24 21:48:50.776862 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.776394 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:50.794967 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.794926 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs" podStartSLOduration=1.794912794 podStartE2EDuration="1.794912794s" podCreationTimestamp="2026-04-24 21:48:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:50.792508286 +0000 UTC m=+1307.675494487" watchObservedRunningTime="2026-04-24 21:48:50.794912794 +0000 UTC m=+1307.677898995"
Apr 24 21:48:50.856749 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:50.856712 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-672zj_d5617082-ad48-4271-8c23-19c149807eba/node-ca/0.log"
Apr 24 21:48:51.656144 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:51.656113 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54fb9fcb78-xvm5f_4eb7ba39-8089-4d1b-a005-3ecfaf77739a/router/0.log"
Apr 24 21:48:52.032928 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:52.032850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5tnsp_f06595a3-8f61-4c8e-93ae-03b5b752052a/serve-healthcheck-canary/0.log"
Apr 24 21:48:52.480786 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:52.480758 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r269t_e5e1e857-05a9-48ac-ae67-6ad2198bbcf7/insights-operator/0.log"
Apr 24 21:48:52.480999 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:52.480983 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r269t_e5e1e857-05a9-48ac-ae67-6ad2198bbcf7/insights-operator/1.log"
Apr 24 21:48:52.577972 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:52.577944 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7phz8_9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7/kube-rbac-proxy/0.log"
Apr 24 21:48:52.606172 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:52.606146 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7phz8_9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7/exporter/0.log"
Apr 24 21:48:52.630195 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:52.630169 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7phz8_9cf2c1f0-9ce8-4900-bae7-fdd5bfcb55e7/extractor/0.log"
Apr 24 21:48:56.788308 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:56.788282 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ngn69/perf-node-gather-daemonset-krmhs"
Apr 24 21:48:59.220762 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:59.220735 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lslhr_befbf90c-3ad0-4580-95b5-aab6da3d5a4d/migrator/0.log"
Apr 24 21:48:59.245833 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:59.245810 2577 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lslhr_befbf90c-3ad0-4580-95b5-aab6da3d5a4d/graceful-termination/0.log" Apr 24 21:48:59.617258 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:59.617230 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nklzr_b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba/kube-storage-version-migrator-operator/1.log" Apr 24 21:48:59.618096 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:48:59.618080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nklzr_b5480cfe-1e95-43c8-a907-f0b5c7dcb7ba/kube-storage-version-migrator-operator/0.log" Apr 24 21:49:00.848370 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:00.848344 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/kube-multus-additional-cni-plugins/0.log" Apr 24 21:49:00.875113 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:00.875047 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/egress-router-binary-copy/0.log" Apr 24 21:49:00.902867 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:00.902841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/cni-plugins/0.log" Apr 24 21:49:00.938976 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:00.938948 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/bond-cni-plugin/0.log" Apr 24 21:49:00.969408 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:00.969384 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/routeoverride-cni/0.log" Apr 24 21:49:00.994112 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:00.994087 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/whereabouts-cni-bincopy/0.log" Apr 24 21:49:01.029527 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:01.029499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-pvrrx_44478f44-ad28-4f73-9fd4-429d584502ef/whereabouts-cni/0.log" Apr 24 21:49:01.338287 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:01.338258 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wl7mb_f6849759-b993-4293-a216-c7f7861f1c3f/kube-multus/0.log" Apr 24 21:49:01.530020 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:01.529990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r5pbt_c88b1b60-c919-439f-810d-ad2b2ecf4811/network-metrics-daemon/0.log" Apr 24 21:49:01.556708 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:01.556677 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r5pbt_c88b1b60-c919-439f-810d-ad2b2ecf4811/kube-rbac-proxy/0.log" Apr 24 21:49:03.004900 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.004869 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/ovn-controller/0.log" Apr 24 21:49:03.030702 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.030674 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/ovn-acl-logging/0.log" Apr 24 21:49:03.063659 ip-10-0-136-65 kubenswrapper[2577]: I0424 
21:49:03.063615 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/kube-rbac-proxy-node/0.log" Apr 24 21:49:03.086190 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.086162 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 21:49:03.106499 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.106476 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/northd/0.log" Apr 24 21:49:03.128192 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.128167 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/nbdb/0.log" Apr 24 21:49:03.154695 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.154672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/sbdb/0.log" Apr 24 21:49:03.252076 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:03.252046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lvpcz_8e9546e8-52dd-4b70-a206-29bd990eb383/ovnkube-controller/0.log" Apr 24 21:49:04.246032 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:04.246000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-5zhmk_a5af1e3f-499d-4135-9910-eec6dffebf8e/check-endpoints/0.log" Apr 24 21:49:04.297686 ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:04.297658 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-g2stj_0c5854d5-4980-4604-9aa1-a757c380c0da/network-check-target-container/0.log" Apr 24 21:49:05.252367 
ip-10-0-136-65 kubenswrapper[2577]: I0424 21:49:05.252337 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-84qsj_c85c2f4a-3310-4862-b16b-7dd95f352625/iptables-alerter/0.log"