Apr 16 16:23:46.059083 ip-10-0-141-93 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:23:46.601971 ip-10-0-141-93 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:23:46.601971 ip-10-0-141-93 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:23:46.601971 ip-10-0-141-93 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:23:46.601971 ip-10-0-141-93 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:23:46.601971 ip-10-0-141-93 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
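All of the "should be set via the config file" deprecation warnings above point at the same remedy: moving the flag values into the KubeletConfiguration file named by --config. A minimal sketch of what that file could look like, with purely illustrative values (not taken from this node):

```yaml
# Hypothetical KubeletConfiguration sketch: the deprecated command-line
# flags above expressed as config-file fields instead. Field names are
# real v1beta1 fields; the values shown are assumptions for illustration.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: /var/run/crio/crio.sock
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
systemReserved:
  cpu: 500m
  memory: 1Gi
```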
Apr 16 16:23:46.606624 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.606560 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:23:46.612354 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612340 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:23:46.612354 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612354 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612358 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612361 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612364 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612367 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612370 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612373 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612376 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612378 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612381 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612384 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612386 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612394 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612397 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612400 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612402 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612405 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612408 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612410 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612413 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:23:46.612413 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612416 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612419 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612422 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612425 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612428 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612431 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612433 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612436 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612438 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612441 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612443 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612446 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612448 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612451 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612454 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612456 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612459 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612461 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612463 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612467 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:23:46.612937 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612470 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612472 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612475 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612478 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612480 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612483 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612485 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612488 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612490 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612493 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612496 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612499 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612503 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612506 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612509 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612512 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612515 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612517 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612520 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:23:46.613427 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612523 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612525 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612528 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612531 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612534 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612536 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612545 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612548 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612550 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612553 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612555 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612558 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612560 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612562 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612565 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612568 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612571 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612589 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612593 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612596 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:46.613996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612599 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612604 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612607 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612610 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612613 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.612617 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613732 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613740 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613743 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613747 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613749 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613752 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613755 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613758 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613761 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613764 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613767 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613769 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613778 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:23:46.614476 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613781 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613783 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613786 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613789 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613792 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613804 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613808 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613811 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613814 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613817 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613819 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613822 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613824 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613827 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613829 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613832 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613834 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613837 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613840 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613842 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:23:46.614950 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613844 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613847 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613849 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613852 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613854 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613857 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613859 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613862 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613865 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613867 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613870 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613872 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613882 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613885 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613887 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613890 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613893 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613896 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613899 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613902 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:23:46.615433 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613905 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613908 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613911 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613913 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613916 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613918 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613921 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613923 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613926 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613928 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613931 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613933 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613936 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613939 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613941 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613944 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613947 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613951 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613954 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:23:46.615936 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613957 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613961 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613965 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613968 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613971 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613974 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613982 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613985 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613988 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613990 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613993 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613995 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.613999 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614002 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614068 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614079 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614092 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614096 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614100 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614103 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614110 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:23:46.616408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614114 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614117 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614120 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614123 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614127 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614130 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614133 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614135 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614138 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614141 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614144 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614148 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614155 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614158 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614161 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614164 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614167 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614171 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614179 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614183 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614186 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614189 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614192 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614195 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614199 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:23:46.616939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614202 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614206 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614208 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614211 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614214 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614217 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614220 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614226 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614229 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614232 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614235 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614238 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614242 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614245 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614248 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614251 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614254 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614256 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614259 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614263 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614266 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:23:46.617538 ip-10-0-141-93
kubenswrapper[2576]: I0416 16:23:46.614269 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614272 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614276 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614279 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:23:46.617538 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614282 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614291 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614294 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614297 2576 flags.go:64] FLAG: --help="false" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614300 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-141-93.ec2.internal" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614304 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614307 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614310 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614313 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614317 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 
16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614319 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614322 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614325 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614328 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614331 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614334 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614337 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614340 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614343 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614346 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614349 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614351 2576 flags.go:64] FLAG: --lock-file="" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614354 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614357 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:23:46.618145 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614360 2576 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614365 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614368 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614371 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614374 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614376 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614380 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614383 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614386 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614390 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614399 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614403 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614406 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614410 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614413 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:23:46.618989 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:23:46.614416 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614419 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614422 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614424 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614431 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614434 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614437 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614440 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:23:46.618989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614443 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614448 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614451 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614454 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614457 2576 flags.go:64] FLAG: --port="10250" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614460 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:23:46.619545 
ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614463 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04ca85c3d59fc2238" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614466 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614469 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614472 2576 flags.go:64] FLAG: --register-node="true" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614475 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614478 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614482 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614485 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614488 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614491 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614495 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614498 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614501 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614504 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614513 2576 flags.go:64] FLAG: --runonce="false" Apr 16 16:23:46.619545 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:23:46.614516 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614521 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614524 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614527 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614530 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:23:46.619545 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614533 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614536 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614539 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614542 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614545 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614548 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614550 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614553 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614556 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614559 2576 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614564 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614567 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614570 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614577 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614580 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614582 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614585 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614591 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614594 2576 flags.go:64] FLAG: --v="2" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614598 2576 flags.go:64] FLAG: --version="false" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614602 2576 flags.go:64] FLAG: --vmodule="" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614612 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.614615 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:23:46.620203 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614728 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:23:46.620203 ip-10-0-141-93 
kubenswrapper[2576]: W0416 16:23:46.614731 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614734 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614737 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614746 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614751 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614753 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614756 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614759 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614762 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614764 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614767 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614769 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614772 2576 feature_gate.go:328] unrecognized feature 
gate: InsightsOnDemandDataGather Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614775 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614777 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614780 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614783 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614785 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614788 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:23:46.620808 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614790 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614795 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614798 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614801 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614804 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614807 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614811 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614813 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614816 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614818 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614821 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614824 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614826 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614829 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614831 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:23:46.621273 
ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614834 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614837 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614847 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614849 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614852 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:23:46.621273 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614854 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614857 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614860 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614862 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614865 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614868 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614870 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614873 2576 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614875 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614878 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614880 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614883 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614885 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614888 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614891 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614893 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614896 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614898 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614902 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:23:46.621766 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614905 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:23:46.622240 ip-10-0-141-93 
kubenswrapper[2576]: W0416 16:23:46.614907 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614912 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614915 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614918 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614920 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614923 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614925 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614928 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614931 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614935 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614943 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614946 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614948 2576 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614951 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614954 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614956 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614959 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614962 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614965 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:23:46.622240 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614968 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614970 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614973 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614976 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614978 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.614981 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:23:46.622779 ip-10-0-141-93 
kubenswrapper[2576]: W0416 16:23:46.614983 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.616014 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.621840 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.621855 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.621898 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.621902 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.621906 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.621909 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.621912 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:23:46.622779 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.621915 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:23:46.622779 ip-10-0-141-93 
Apr 16 16:23:46.623655 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.622016 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:46.624132 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:46.622041 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:46.627293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.623227 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:23:46.627293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.625185 2576 bootstrap.go:101] "Use the bootstrap credentials to
request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:23:46.627293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.626171 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 16:23:46.627293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.626267 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:23:46.627293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.626308 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:23:46.660827 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.660813 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:23:46.667025 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.667007 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:23:46.685936 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.685916 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:23:46.691709 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.691697 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 16:23:46.692980 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.692957 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:23:46.696954 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.696938 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:23:46.698618 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.698601 2576 fs.go:135] Filesystem UUIDs: map[43852bfe-f250-43a5-af8b-34911af25806:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 dfc8d819-71e2-49ba-a3ad-b4b4bc442a02:/dev/nvme0n1p4]
Apr 16 16:23:46.698670 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.698618 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:23:46.704832 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.704735 2576 manager.go:217] Machine: {Timestamp:2026-04-16 16:23:46.702170995 +0000 UTC m=+0.491519111 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3113212 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2374332ddae0962cec7866489c0400 SystemUUID:ec237433-2dda-e096-2cec-7866489c0400 BootID:71f9c5dc-9074-4769-ac82-87af36f4b691 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f7:53:ac:12:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f7:53:ac:12:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:e1:fc:cc:09:70 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:23:46.704832 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.704828 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:23:46.704975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.704893 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:23:46.705218 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.705197 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:23:46.705341 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.705219 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-93.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:23:46.705382 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.705349 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:23:46.705382 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.705358 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:23:46.705382 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.705370 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:23:46.707816 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.707804 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:23:46.709173 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.709163 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:23:46.709383 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.709374 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:23:46.712528 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.712518 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:23:46.712564 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.712531 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:23:46.712564 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.712545 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:23:46.712564 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.712554 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:23:46.712564 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.712562 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:23:46.713961 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.713950 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:23:46.714005 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.713967 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:23:46.719932 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.719915 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:23:46.721817 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.721797 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:23:46.723305 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723290 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723309 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723316 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723321 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723326 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723331 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723337 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723342 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723349 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:23:46.723355 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723355 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:23:46.723590 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723364 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:23:46.723590 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.723373 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:23:46.725065 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.725048 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:23:46.725065 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.725063 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:23:46.728341 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.728322 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-93.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:23:46.728426 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.728341 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:23:46.728426 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.728365 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:23:46.728426 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.728392 2576 server.go:1295] "Started kubelet"
Apr 16 16:23:46.728426 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.728395 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:23:46.728609 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.728475 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:23:46.728609 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.728479 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:23:46.728609 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.728542 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:23:46.729013 ip-10-0-141-93 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:23:46.729549 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.729496 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:23:46.731492 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.731477 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:23:46.735533 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.735515 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:23:46.735903 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.735889 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:23:46.736615 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.736600 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:23:46.736734 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.736647 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:23:46.736825 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.736815 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:23:46.736950 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.736935 2576 factory.go:153] Registering CRI-O factory
Apr 16 16:23:46.737018 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.736953 2576 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:23:46.737069 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737019 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:23:46.737069 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737029 2576 factory.go:55] Registering systemd factory
Apr 16 16:23:46.737069 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737037 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:23:46.737069 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737057 2576 factory.go:103] Registering Raw factory
Apr 16 16:23:46.737240 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737081 2576 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:23:46.737365 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737177 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:23:46.737446 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737433 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:23:46.737517 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.737504 2576 manager.go:319] Starting recovery of all containers
Apr 16 16:23:46.737825 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.736799 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:46.739167 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.739075 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:23:46.746041 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.745886 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 16:23:46.746041 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.745908 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 16:23:46.749511 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.745902 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-93.ec2.internal.18a6e2f0c28e5255 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-93.ec2.internal,UID:ip-10-0-141-93.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-93.ec2.internal,},FirstTimestamp:2026-04-16 16:23:46.728374869 +0000 UTC m=+0.517722988,LastTimestamp:2026-04-16 16:23:46.728374869 +0000 UTC m=+0.517722988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-93.ec2.internal,}"
Apr 16 16:23:46.749511 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.749491 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rhhj8"
Apr 16 16:23:46.752534 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.752517 2576 manager.go:324] Recovery completed
Apr 16 16:23:46.756399 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.756387 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:23:46.758906 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.758890 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:23:46.758967 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.758917 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:23:46.758967 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.758927 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:23:46.759363 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.759352 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:23:46.759407 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.759365 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:23:46.759407 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.759381 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:23:46.761449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.761436 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rhhj8"
Apr 16 16:23:46.762392 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.762378 2576 policy_none.go:49] "None policy: Start"
Apr 16 16:23:46.762445 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.762435 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:23:46.762479 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.762452 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:23:46.797208 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797193 2576 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.797218 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797232 2576 server.go:85] "Starting device plugin registration server"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797472 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797485 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797583 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797698 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.797707 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.798131 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:23:46.821782 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.798167 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:46.837201 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.837169 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:23:46.838298 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.838279 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:23:46.838358 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.838305 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:23:46.838358 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.838323 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:23:46.838358 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.838330 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:23:46.838465 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.838361 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:23:46.840493 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.840476 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:46.898163 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.898109 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:23:46.898862 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.898845 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:23:46.898932 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.898877 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:23:46.898932 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.898888 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:23:46.898932 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.898907 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-93.ec2.internal"
Apr 16 16:23:46.907654 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.907641 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-93.ec2.internal"
Apr 16 16:23:46.907713 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.907659 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-93.ec2.internal\": node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:46.930258 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.930240 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:46.938551 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.938533 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"]
Apr 16 16:23:46.938611 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.938601 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:23:46.939418 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.939404 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:23:46.939473 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.939429 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:23:46.939473 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.939438 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:23:46.941703 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.941691 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:23:46.941851 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.941838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:46.941889 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.941865 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:23:46.942264 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.942246 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:23:46.942337 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.942271 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:23:46.942337 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.942281 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:23:46.942337 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.942283 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:23:46.942337 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.942304 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:23:46.942337 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.942318 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:23:46.944962 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.944947 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:46.945043 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.944970 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:23:46.945520 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.945505 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:23:46.945599 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.945530 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:23:46.945599 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:46.945540 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:23:46.971145 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.971127 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-93.ec2.internal\" not found" node="ip-10-0-141-93.ec2.internal"
Apr 16 16:23:46.975402 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:46.975388 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-93.ec2.internal\" not found" node="ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.030499 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.030479 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.038864 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.038844 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d69bb2ba9c62ea055a62e31adcc63bda-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal\" (UID: \"d69bb2ba9c62ea055a62e31adcc63bda\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.038924 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.038868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d69bb2ba9c62ea055a62e31adcc63bda-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal\" (UID: \"d69bb2ba9c62ea055a62e31adcc63bda\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.038924 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.038887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0b233db55a7ade7d393ce1a96715106e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-93.ec2.internal\" (UID: \"0b233db55a7ade7d393ce1a96715106e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.131101 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.131080 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.139537 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.139516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d69bb2ba9c62ea055a62e31adcc63bda-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal\" (UID: \"d69bb2ba9c62ea055a62e31adcc63bda\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.139597 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.139486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d69bb2ba9c62ea055a62e31adcc63bda-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal\" (UID: \"d69bb2ba9c62ea055a62e31adcc63bda\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.139597 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.139586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0b233db55a7ade7d393ce1a96715106e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-93.ec2.internal\" (UID: \"0b233db55a7ade7d393ce1a96715106e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.139670 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.139605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d69bb2ba9c62ea055a62e31adcc63bda-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal\" (UID: \"d69bb2ba9c62ea055a62e31adcc63bda\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.139670 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.139645 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d69bb2ba9c62ea055a62e31adcc63bda-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal\" (UID: \"d69bb2ba9c62ea055a62e31adcc63bda\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.139764 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.139668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0b233db55a7ade7d393ce1a96715106e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-93.ec2.internal\" (UID: \"0b233db55a7ade7d393ce1a96715106e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.231919 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.231866 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.275344 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.275318 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.277696 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.277659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.332138 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.332111 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.432686 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.432649 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.533131 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.533099 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.625461 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.625435 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:23:47.625461 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.625449 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:47.625919 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.625556 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:23:47.634268 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.634249 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-93.ec2.internal\" not found"
Apr 16 16:23:47.636609 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.636592 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:47.713433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.713407 2576 apiserver.go:52] "Watching apiserver"
Apr 16 16:23:47.725877 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.725848 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:23:47.726595 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.726567 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9ctsd","openshift-cluster-node-tuning-operator/tuned-drsbg","openshift-dns/node-resolver-tqwdh","openshift-image-registry/node-ca-zfglr","openshift-multus/multus-additional-cni-plugins-kpjzv","openshift-network-diagnostics/network-check-target-t7shk","openshift-network-operator/iptables-alerter-r5v2l","kube-system/konnectivity-agent-2jhj4","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw","openshift-multus/multus-x2nrk","openshift-multus/network-metrics-daemon-6dvl8"]
Apr 16 16:23:47.731618 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.731590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:47.733908 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.733889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.733908 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.733913 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84d79\""
Apr 16 16:23:47.734097 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.733929 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:23:47.734097 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.733955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:23:47.735669 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.735644 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:23:47.736060 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:23:47.736060 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736047 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.736314 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736271 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:23:47.736433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736316 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-svp7v\""
Apr 16 16:23:47.736433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736406 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:23:47.736615 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:23:47.736615 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.736563 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7"
Apr 16 16:23:47.736601 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736601 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.736909 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.736884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:23:47.738891 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.738869 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.739745 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.739721 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:23:47.739836 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.739771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:23:47.739836 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.739788 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-62zc7\"" Apr 16 16:23:47.739836 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.739797 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:23:47.739836 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.739799 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:23:47.740006 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.739863 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:23:47.741643 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8593e95d-58e6-43c3-99b0-5582e1e25f39-tmp-dir\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh" Apr 16 16:23:47.741806 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-system-cni-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.741806 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741690 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zfglr" Apr 16 16:23:47.741806 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-socket-dir-parent\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.741806 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-systemd\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.741806 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-env-overrides\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741825 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jtp8d\"" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741834 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8593e95d-58e6-43c3-99b0-5582e1e25f39-hosts-file\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741847 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-ovn\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-kubernetes\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc5dz\" (UniqueName: \"kubernetes.io/projected/8593e95d-58e6-43c3-99b0-5582e1e25f39-kube-api-access-tc5dz\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.741985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-cnibin\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7fea14-3672-4005-bbe8-e59d933d3173-cni-binary-copy\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnnm4\" (UniqueName: \"kubernetes.io/projected/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-kube-api-access-bnnm4\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-systemd-units\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-log-socket\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742142 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kptkl\" (UniqueName: \"kubernetes.io/projected/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-kube-api-access-kptkl\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-sys\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ee219be2-6e0e-45ac-874e-43970e574181-etc-tuned\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-node-log\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742221 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-cni-netd\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysctl-conf\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742277 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wch5\" (UniqueName: \"kubernetes.io/projected/ee219be2-6e0e-45ac-874e-43970e574181-kube-api-access-4wch5\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-kubelet\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-multus-certs\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742329 2576 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-etc-kubernetes\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-cni-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-var-lib-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovnkube-script-lib\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovnkube-config\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysconfig\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.742600 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-cni-bin\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-os-release\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742596 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx2gl\" (UniqueName: \"kubernetes.io/projected/6e7fea14-3672-4005-bbe8-e59d933d3173-kube-api-access-tx2gl\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-run-netns\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-cni-multus\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-conf-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:47.743449 
ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-kubelet\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-slash\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-cni-bin\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysctl-d\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742909 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-lib-modules\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " 
pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.742930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-hostroot\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovn-node-metrics-cert\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-k8s-cni-cncf-io\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.743449 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-modprobe-d\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-systemd\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-var-lib-kubelet\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee219be2-6e0e-45ac-874e-43970e574181-tmp\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-daemon-config\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743217 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-run\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-host\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-netns\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-etc-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.744124 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:23:47.743833 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-f86px\""
Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:23:47.744124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.743963 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:23:47.744606 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.744229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.744606 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.744241 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:23:47.746573 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.746553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:23:47.746726 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.746658 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:23:47.746726 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.746648 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19"
Apr 16 16:23:47.746900 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.746787 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rnf7g\""
Apr 16 16:23:47.746967 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.746944 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:23:47.748997 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.748977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.751014 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.750982 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:23:47.751115 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.751072 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"
Apr 16 16:23:47.751560 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.751539 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:23:47.751661 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.751604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2q4vz\""
Apr 16 16:23:47.751758 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.751735 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:23:47.752259 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.752017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:23:47.752259 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.752103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.756530 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.756511 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal"]
Apr 16 16:23:47.756660 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.756647 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.758892 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.758837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:23:47.759131 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.759115 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qcdj8\""
Apr 16 16:23:47.759728 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.759707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:23:47.760481 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.760465 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:23:47.760526 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.760514 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:23:47.760778 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.760765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:23:47.761132 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.761117 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:23:47.761412 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.761397 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-d5kdb\""
Apr 16 16:23:47.762627 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.762605 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:18:46 +0000 UTC" deadline="2028-01-13 00:49:17.241841466 +0000 UTC"
Apr 16 16:23:47.762685 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.762626 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15272h25m29.479218067s"
Apr 16 16:23:47.764279 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.764263 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal"]
Apr 16 16:23:47.764482 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.764471 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:23:47.777344 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.777322 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m546t"
Apr 16 16:23:47.786713 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.786660 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m546t"
Apr 16 16:23:47.837491 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.837472 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:23:47.843597 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-etc-selinux\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.843694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/063cc77b-5a11-4e5a-a733-15acf54a40e8-agent-certs\") pod \"konnectivity-agent-2jhj4\" (UID: \"063cc77b-5a11-4e5a-a733-15acf54a40e8\") " pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.843694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8593e95d-58e6-43c3-99b0-5582e1e25f39-tmp-dir\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:47.843694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-system-cni-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.843694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-socket-dir-parent\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.843694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-systemd\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-env-overrides\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/32a91c59-c74e-45df-ab79-de8449b1b1e3-serviceca\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-system-cni-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-socket-dir-parent\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-systemd\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvp8\" (UniqueName: \"kubernetes.io/projected/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-kube-api-access-lmvp8\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8593e95d-58e6-43c3-99b0-5582e1e25f39-hosts-file\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8593e95d-58e6-43c3-99b0-5582e1e25f39-tmp-dir\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-ovn\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.843912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-kubernetes\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-cnibin\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8593e95d-58e6-43c3-99b0-5582e1e25f39-hosts-file\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-kubernetes\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-ovn\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n5mm\" (UniqueName: \"kubernetes.io/projected/51f23044-6510-4235-820f-fdca93d4bab6-kube-api-access-5n5mm\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.843988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-sys-fs\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5dz\" (UniqueName: \"kubernetes.io/projected/8593e95d-58e6-43c3-99b0-5582e1e25f39-kube-api-access-tc5dz\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkd46\" (UniqueName: \"kubernetes.io/projected/32a91c59-c74e-45df-ab79-de8449b1b1e3-kube-api-access-qkd46\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-cnibin\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-cnibin\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-env-overrides\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7fea14-3672-4005-bbe8-e59d933d3173-cni-binary-copy\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnnm4\" (UniqueName: \"kubernetes.io/projected/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-kube-api-access-bnnm4\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:23:47.844244 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-systemd-units\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-systemd-units\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-log-socket\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-log-socket\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kptkl\" (UniqueName: \"kubernetes.io/projected/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-kube-api-access-kptkl\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-sys\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ee219be2-6e0e-45ac-874e-43970e574181-etc-tuned\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-sys\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-node-log\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-cni-netd\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-node-log\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysctl-conf\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wch5\" (UniqueName: \"kubernetes.io/projected/ee219be2-6e0e-45ac-874e-43970e574181-kube-api-access-4wch5\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-cni-netd\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-registration-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e7fea14-3672-4005-bbe8-e59d933d3173-cni-binary-copy\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.844730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysctl-conf\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn2v2\" (UniqueName: \"kubernetes.io/projected/b7c4234e-0528-40b5-b909-5324aef04be7-kube-api-access-wn2v2\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-kubelet\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844659 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-multus-certs\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-multus-certs\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-etc-kubernetes\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-kubelet\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-etc-kubernetes\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-os-release\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-iptables-alerter-script\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-cni-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-var-lib-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovnkube-script-lib\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-cni-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-var-lib-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.844974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovnkube-config\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysconfig\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.845308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysconfig\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/063cc77b-5a11-4e5a-a733-15acf54a40e8-konnectivity-ca\") pod \"konnectivity-agent-2jhj4\" (UID: \"063cc77b-5a11-4e5a-a733-15acf54a40e8\") " pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-cni-bin\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-cni-bin\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-run-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-os-release\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx2gl\" (UniqueName: \"kubernetes.io/projected/6e7fea14-3672-4005-bbe8-e59d933d3173-kube-api-access-tx2gl\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-run-netns\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-os-release\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-cni-multus\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-conf-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-var-lib-cni-multus\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845400 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovnkube-script-lib\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-cni-binary-copy\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv" Apr 16 16:23:47.845975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a91c59-c74e-45df-ab79-de8449b1b1e3-host\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-conf-dir\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-run-netns\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845404 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovnkube-config\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.845515 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-kubelet\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-slash\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-slash\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.845606 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:48.345561117 +0000 UTC m=+2.134909222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-kubelet\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-cni-bin\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysctl-d\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-lib-modules\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 
16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-cni-bin\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-socket-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-hostroot\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.846835 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-sysctl-d\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " 
pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-lib-modules\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-hostroot\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovn-node-metrics-cert\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-system-cni-dir\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-k8s-cni-cncf-io\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.845987 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-modprobe-d\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-systemd\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-var-lib-kubelet\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee219be2-6e0e-45ac-874e-43970e574181-tmp\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-host-slash\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846109 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-systemd\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-daemon-config\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.847638 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-run\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-host\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-device-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846225 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-netns\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-etc-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-var-lib-kubelet\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-host-run-ovn-kubernetes\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 
16:23:47.846379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-host\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-etc-modprobe-d\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-netns\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-etc-openvswitch\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e7fea14-3672-4005-bbe8-e59d933d3173-host-run-k8s-cni-cncf-io\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846661 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ee219be2-6e0e-45ac-874e-43970e574181-run\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.848541 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.846799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e7fea14-3672-4005-bbe8-e59d933d3173-multus-daemon-config\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.849205 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.848869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-ovn-node-metrics-cert\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.849205 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.849068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ee219be2-6e0e-45ac-874e-43970e574181-etc-tuned\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.849205 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.849090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee219be2-6e0e-45ac-874e-43970e574181-tmp\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.853896 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.853871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kptkl\" (UniqueName: 
\"kubernetes.io/projected/c2efb54c-bc99-4126-ac22-6f7a17d6cd42-kube-api-access-kptkl\") pod \"ovnkube-node-9ctsd\" (UID: \"c2efb54c-bc99-4126-ac22-6f7a17d6cd42\") " pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:23:47.853975 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.853903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc5dz\" (UniqueName: \"kubernetes.io/projected/8593e95d-58e6-43c3-99b0-5582e1e25f39-kube-api-access-tc5dz\") pod \"node-resolver-tqwdh\" (UID: \"8593e95d-58e6-43c3-99b0-5582e1e25f39\") " pod="openshift-dns/node-resolver-tqwdh" Apr 16 16:23:47.856217 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.856194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wch5\" (UniqueName: \"kubernetes.io/projected/ee219be2-6e0e-45ac-874e-43970e574181-kube-api-access-4wch5\") pod \"tuned-drsbg\" (UID: \"ee219be2-6e0e-45ac-874e-43970e574181\") " pod="openshift-cluster-node-tuning-operator/tuned-drsbg" Apr 16 16:23:47.857019 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.857001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnnm4\" (UniqueName: \"kubernetes.io/projected/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-kube-api-access-bnnm4\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:47.857219 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.857201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx2gl\" (UniqueName: \"kubernetes.io/projected/6e7fea14-3672-4005-bbe8-e59d933d3173-kube-api-access-tx2gl\") pod \"multus-x2nrk\" (UID: \"6e7fea14-3672-4005-bbe8-e59d933d3173\") " pod="openshift-multus/multus-x2nrk" Apr 16 16:23:47.940942 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:47.940916 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b233db55a7ade7d393ce1a96715106e.slice/crio-8d4b6ca1cab356aacd0d5e2099f2f0a4345251bd6f654070bee3f65592bf4d39 WatchSource:0}: Error finding container 8d4b6ca1cab356aacd0d5e2099f2f0a4345251bd6f654070bee3f65592bf4d39: Status 404 returned error can't find the container with id 8d4b6ca1cab356aacd0d5e2099f2f0a4345251bd6f654070bee3f65592bf4d39 Apr 16 16:23:47.941404 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:47.941380 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69bb2ba9c62ea055a62e31adcc63bda.slice/crio-3d887ad27209fc35dca1e1750797368a161d99f5bd94ff4fe0843f1322e59db2 WatchSource:0}: Error finding container 3d887ad27209fc35dca1e1750797368a161d99f5bd94ff4fe0843f1322e59db2: Status 404 returned error can't find the container with id 3d887ad27209fc35dca1e1750797368a161d99f5bd94ff4fe0843f1322e59db2 Apr 16 16:23:47.945101 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.945087 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:23:47.946842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:47.946914 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-registration-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" Apr 16 16:23:47.946914 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn2v2\" (UniqueName: \"kubernetes.io/projected/b7c4234e-0528-40b5-b909-5324aef04be7-kube-api-access-wn2v2\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" Apr 16 16:23:47.946914 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-os-release\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv" Apr 16 16:23:47.946914 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-iptables-alerter-script\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l" Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv" Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-registration-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-os-release\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.946979 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/063cc77b-5a11-4e5a-a733-15acf54a40e8-konnectivity-ca\") pod \"konnectivity-agent-2jhj4\" (UID: \"063cc77b-5a11-4e5a-a733-15acf54a40e8\") " pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-cni-binary-copy\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a91c59-c74e-45df-ab79-de8449b1b1e3-host\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.947094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-socket-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-system-cni-dir\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a91c59-c74e-45df-ab79-de8449b1b1e3-host\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-host-slash\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-device-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-etc-selinux\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/063cc77b-5a11-4e5a-a733-15acf54a40e8-agent-certs\") pod \"konnectivity-agent-2jhj4\" (UID: \"063cc77b-5a11-4e5a-a733-15acf54a40e8\") " pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-socket-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/32a91c59-c74e-45df-ab79-de8449b1b1e3-serviceca\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-device-dir\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvp8\" (UniqueName: \"kubernetes.io/projected/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-kube-api-access-lmvp8\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-cnibin\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.947428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5n5mm\" (UniqueName: \"kubernetes.io/projected/51f23044-6510-4235-820f-fdca93d4bab6-kube-api-access-5n5mm\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-etc-selinux\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-sys-fs\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkd46\" (UniqueName: \"kubernetes.io/projected/32a91c59-c74e-45df-ab79-de8449b1b1e3-kube-api-access-qkd46\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b7c4234e-0528-40b5-b909-5324aef04be7-sys-fs\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/063cc77b-5a11-4e5a-a733-15acf54a40e8-konnectivity-ca\") pod \"konnectivity-agent-2jhj4\" (UID: \"063cc77b-5a11-4e5a-a733-15acf54a40e8\") " pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-iptables-alerter-script\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-cni-binary-copy\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-system-cni-dir\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-cnibin\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-host-slash\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/51f23044-6510-4235-820f-fdca93d4bab6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.948159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.947957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/32a91c59-c74e-45df-ab79-de8449b1b1e3-serviceca\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.948659 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.948164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51f23044-6510-4235-820f-fdca93d4bab6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.949753 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.949735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/063cc77b-5a11-4e5a-a733-15acf54a40e8-agent-certs\") pod \"konnectivity-agent-2jhj4\" (UID: \"063cc77b-5a11-4e5a-a733-15acf54a40e8\") " pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:47.953904 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.953887 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:23:47.953904 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.953907 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:23:47.954046 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.953920 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:47.954046 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:47.954015 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:48.453995904 +0000 UTC m=+2.243344012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:47.955191 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.955168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkd46\" (UniqueName: \"kubernetes.io/projected/32a91c59-c74e-45df-ab79-de8449b1b1e3-kube-api-access-qkd46\") pod \"node-ca-zfglr\" (UID: \"32a91c59-c74e-45df-ab79-de8449b1b1e3\") " pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:47.955290 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.955270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn2v2\" (UniqueName: \"kubernetes.io/projected/b7c4234e-0528-40b5-b909-5324aef04be7-kube-api-access-wn2v2\") pod \"aws-ebs-csi-driver-node-vf5rw\" (UID: \"b7c4234e-0528-40b5-b909-5324aef04be7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:47.955552 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.955535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n5mm\" (UniqueName: \"kubernetes.io/projected/51f23044-6510-4235-820f-fdca93d4bab6-kube-api-access-5n5mm\") pod \"multus-additional-cni-plugins-kpjzv\" (UID: \"51f23044-6510-4235-820f-fdca93d4bab6\") " pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:47.955883 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:47.955868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvp8\" (UniqueName: \"kubernetes.io/projected/ddc36c0b-3388-4a0a-a038-6e0a618d18c2-kube-api-access-lmvp8\") pod \"iptables-alerter-r5v2l\" (UID: \"ddc36c0b-3388-4a0a-a038-6e0a618d18c2\") " pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:48.059966 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.059895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tqwdh"
Apr 16 16:23:48.066048 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.066027 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8593e95d_58e6_43c3_99b0_5582e1e25f39.slice/crio-674e8b8d0d8c885ffd24bb25e64d28d478fc16b90280f609e7bcc1a6eed42888 WatchSource:0}: Error finding container 674e8b8d0d8c885ffd24bb25e64d28d478fc16b90280f609e7bcc1a6eed42888: Status 404 returned error can't find the container with id 674e8b8d0d8c885ffd24bb25e64d28d478fc16b90280f609e7bcc1a6eed42888
Apr 16 16:23:48.076905 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.076887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x2nrk"
Apr 16 16:23:48.082996 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.082976 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7fea14_3672_4005_bbe8_e59d933d3173.slice/crio-dff26e9ae72cb5d03201df77babb2095749f4fbd4df5e5a4881ef6f86ee17925 WatchSource:0}: Error finding container dff26e9ae72cb5d03201df77babb2095749f4fbd4df5e5a4881ef6f86ee17925: Status 404 returned error can't find the container with id dff26e9ae72cb5d03201df77babb2095749f4fbd4df5e5a4881ef6f86ee17925
Apr 16 16:23:48.097284 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.097268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:23:48.102828 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.102804 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2efb54c_bc99_4126_ac22_6f7a17d6cd42.slice/crio-de1b9e82eb5a1d60a905f353c8f70d8b70387dc9b3ab8b8bea0f376df7159207 WatchSource:0}: Error finding container de1b9e82eb5a1d60a905f353c8f70d8b70387dc9b3ab8b8bea0f376df7159207: Status 404 returned error can't find the container with id de1b9e82eb5a1d60a905f353c8f70d8b70387dc9b3ab8b8bea0f376df7159207
Apr 16 16:23:48.108577 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.108561 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-drsbg"
Apr 16 16:23:48.114147 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.114124 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee219be2_6e0e_45ac_874e_43970e574181.slice/crio-5dc739dae3a605394d4ef76f42254e573f39a52b776e55a59bf39087c2aae80f WatchSource:0}: Error finding container 5dc739dae3a605394d4ef76f42254e573f39a52b776e55a59bf39087c2aae80f: Status 404 returned error can't find the container with id 5dc739dae3a605394d4ef76f42254e573f39a52b776e55a59bf39087c2aae80f
Apr 16 16:23:48.125883 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.125867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zfglr"
Apr 16 16:23:48.130662 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.130632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kpjzv"
Apr 16 16:23:48.130798 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.130780 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a91c59_c74e_45df_ab79_de8449b1b1e3.slice/crio-13aca97c95c9cc0c72345963c32342e644060ce14c48626f97a9496f85800c7a WatchSource:0}: Error finding container 13aca97c95c9cc0c72345963c32342e644060ce14c48626f97a9496f85800c7a: Status 404 returned error can't find the container with id 13aca97c95c9cc0c72345963c32342e644060ce14c48626f97a9496f85800c7a
Apr 16 16:23:48.135837 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.135820 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f23044_6510_4235_820f_fdca93d4bab6.slice/crio-45dd9df8217a24f24534a06a2e7e7fba679c95918dbaab88cdb8b047991395bb WatchSource:0}: Error finding container 45dd9df8217a24f24534a06a2e7e7fba679c95918dbaab88cdb8b047991395bb: Status 404 returned error can't find the container with id 45dd9df8217a24f24534a06a2e7e7fba679c95918dbaab88cdb8b047991395bb
Apr 16 16:23:48.160592 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.160575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r5v2l"
Apr 16 16:23:48.165414 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.165395 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddc36c0b_3388_4a0a_a038_6e0a618d18c2.slice/crio-4e848b382cfd2dc13d958bb24368f77955e549b3a4bc0211eda38f74498fec59 WatchSource:0}: Error finding container 4e848b382cfd2dc13d958bb24368f77955e549b3a4bc0211eda38f74498fec59: Status 404 returned error can't find the container with id 4e848b382cfd2dc13d958bb24368f77955e549b3a4bc0211eda38f74498fec59
Apr 16 16:23:48.176668 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.176652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2jhj4"
Apr 16 16:23:48.180180 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.180150 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw"
Apr 16 16:23:48.182299 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.182282 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod063cc77b_5a11_4e5a_a733_15acf54a40e8.slice/crio-ef0c2857d27fe8bd1f11e3110e28d6563aeeaa6e416adfd4d7b76520eae67bda WatchSource:0}: Error finding container ef0c2857d27fe8bd1f11e3110e28d6563aeeaa6e416adfd4d7b76520eae67bda: Status 404 returned error can't find the container with id ef0c2857d27fe8bd1f11e3110e28d6563aeeaa6e416adfd4d7b76520eae67bda
Apr 16 16:23:48.186782 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:23:48.186764 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c4234e_0528_40b5_b909_5324aef04be7.slice/crio-eac699bbcbba047e01fe2ca071f5c3410c0efbbd8c524916e3cf11a004a4a898 WatchSource:0}: Error finding container eac699bbcbba047e01fe2ca071f5c3410c0efbbd8c524916e3cf11a004a4a898: Status 404 returned error can't find the container with id eac699bbcbba047e01fe2ca071f5c3410c0efbbd8c524916e3cf11a004a4a898
Apr 16 16:23:48.219432 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.217628 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:48.350264 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.350210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:23:48.350358 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:48.350326 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:48.350417 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:48.350391 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:49.350374399 +0000 UTC m=+3.139722511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:48.518235 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.518069 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:48.551193 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.551164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:23:48.551348 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:48.551333 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:23:48.551433 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:48.551353 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:23:48.551433 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:48.551365 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:48.551433 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:48.551419 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:49.551400841 +0000 UTC m=+3.340748948 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:48.788189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.788155 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:18:47 +0000 UTC" deadline="2027-12-29 17:51:31.073505558 +0000 UTC"
Apr 16 16:23:48.788189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.788189 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14929h27m42.28532171s"
Apr 16 16:23:48.873347 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.873289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2nrk" event={"ID":"6e7fea14-3672-4005-bbe8-e59d933d3173","Type":"ContainerStarted","Data":"dff26e9ae72cb5d03201df77babb2095749f4fbd4df5e5a4881ef6f86ee17925"}
Apr 16 16:23:48.880092 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.880064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tqwdh" event={"ID":"8593e95d-58e6-43c3-99b0-5582e1e25f39","Type":"ContainerStarted","Data":"674e8b8d0d8c885ffd24bb25e64d28d478fc16b90280f609e7bcc1a6eed42888"}
Apr 16 16:23:48.912526 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.912468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" event={"ID":"b7c4234e-0528-40b5-b909-5324aef04be7","Type":"ContainerStarted","Data":"eac699bbcbba047e01fe2ca071f5c3410c0efbbd8c524916e3cf11a004a4a898"}
Apr 16 16:23:48.920540 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.920515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerStarted","Data":"45dd9df8217a24f24534a06a2e7e7fba679c95918dbaab88cdb8b047991395bb"}
Apr 16 16:23:48.925723 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.925670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfglr" event={"ID":"32a91c59-c74e-45df-ab79-de8449b1b1e3","Type":"ContainerStarted","Data":"13aca97c95c9cc0c72345963c32342e644060ce14c48626f97a9496f85800c7a"}
Apr 16 16:23:48.939052 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.939026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal" event={"ID":"d69bb2ba9c62ea055a62e31adcc63bda","Type":"ContainerStarted","Data":"3d887ad27209fc35dca1e1750797368a161d99f5bd94ff4fe0843f1322e59db2"}
Apr 16 16:23:48.968902 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.968871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal" event={"ID":"0b233db55a7ade7d393ce1a96715106e","Type":"ContainerStarted","Data":"8d4b6ca1cab356aacd0d5e2099f2f0a4345251bd6f654070bee3f65592bf4d39"}
Apr 16 16:23:48.977216 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:48.977191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2jhj4" event={"ID":"063cc77b-5a11-4e5a-a733-15acf54a40e8","Type":"ContainerStarted","Data":"ef0c2857d27fe8bd1f11e3110e28d6563aeeaa6e416adfd4d7b76520eae67bda"}
Apr 16 16:23:49.011716 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.010072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r5v2l" event={"ID":"ddc36c0b-3388-4a0a-a038-6e0a618d18c2","Type":"ContainerStarted","Data":"4e848b382cfd2dc13d958bb24368f77955e549b3a4bc0211eda38f74498fec59"}
Apr 16 16:23:49.013995 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.013967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-drsbg" event={"ID":"ee219be2-6e0e-45ac-874e-43970e574181","Type":"ContainerStarted","Data":"5dc739dae3a605394d4ef76f42254e573f39a52b776e55a59bf39087c2aae80f"}
Apr 16 16:23:49.027021 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.026996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"de1b9e82eb5a1d60a905f353c8f70d8b70387dc9b3ab8b8bea0f376df7159207"}
Apr 16 16:23:49.358457 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.358419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:23:49.358618 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.358569 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:49.358696 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.358631 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:51.358608239 +0000 UTC m=+5.147956359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:49.561446 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.560852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:23:49.561446 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.561003 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:23:49.561446 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.561023 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:23:49.561446 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.561036 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:49.561446 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.561096 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:51.56107674 +0000 UTC m=+5.350424848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:49.788647 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.788607 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:18:47 +0000 UTC" deadline="2027-10-22 12:08:47.237075075 +0000 UTC"
Apr 16 16:23:49.788647 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.788643 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13291h44m57.44843554s"
Apr 16 16:23:49.838526 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.838499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:23:49.838709 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.838671 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19"
Apr 16 16:23:49.839176 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:49.839157 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:23:49.839289 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:49.839267 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:23:51.375401 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:51.375368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:51.375865 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.375503 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:51.375865 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.375563 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. No retries permitted until 2026-04-16 16:23:55.375544518 +0000 UTC m=+9.164892628 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:51.576710 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:51.576596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:51.577407 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.576955 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:51.577407 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.576980 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:51.577407 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.576993 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:51.577407 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.577057 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:55.577036899 +0000 UTC m=+9.366385019 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:51.839456 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:51.839424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:51.839613 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.839536 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:23:51.839613 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:51.839585 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:51.839756 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:51.839716 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:23:53.838955 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:53.838920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:53.839348 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:53.839047 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:23:53.839489 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:53.839471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:53.839591 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:53.839572 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:23:55.405077 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:55.404972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:55.405542 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.405098 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:55.405542 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.405167 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:03.405147488 +0000 UTC m=+17.194495596 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:55.605959 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:55.605903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:55.606164 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.606043 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:55.606164 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.606062 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:55.606164 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.606075 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:55.606164 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.606139 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:03.606121496 +0000 UTC m=+17.395469605 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:55.839286 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:55.839238 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:55.839460 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:55.839248 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:55.839460 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.839372 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:23:55.839460 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:55.839439 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:23:57.838509 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:57.838477 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:57.838972 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:57.838478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:57.838972 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:57.838574 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:23:57.838972 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:57.838690 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:23:59.839028 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:59.838824 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:23:59.839470 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:23:59.838899 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:23:59.839470 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:59.839109 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:23:59.839470 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:23:59.839223 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:01.838634 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:01.838588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:01.839040 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:01.838594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:01.839040 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:01.838719 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:01.839040 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:01.838790 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:03.461914 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:03.461853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:03.462389 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.461994 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:03.462389 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.462057 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:19.462038191 +0000 UTC m=+33.251386304 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:03.663273 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:03.663241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:03.663451 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.663403 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:03.663451 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.663426 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:03.663451 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.663440 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:03.663659 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.663501 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:19.663482384 +0000 UTC m=+33.452830497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:03.839519 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:03.839493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:03.839671 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.839589 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:03.839671 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:03.839495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:03.839671 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:03.839670 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:04.151395 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.151324 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t98gk"] Apr 16 16:24:04.182326 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.182301 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.182455 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:04.182375 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:04.269049 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.269023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/883d7c41-dae5-4d36-b39e-13485dde73de-kubelet-config\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.269185 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.269080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/883d7c41-dae5-4d36-b39e-13485dde73de-dbus\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.269185 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.269121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.370221 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.370190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/883d7c41-dae5-4d36-b39e-13485dde73de-kubelet-config\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.370393 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.370246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/883d7c41-dae5-4d36-b39e-13485dde73de-dbus\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.370393 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.370273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.370393 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.370331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/883d7c41-dae5-4d36-b39e-13485dde73de-kubelet-config\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.370554 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:04.370402 2576 
secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:04.370554 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:04.370465 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret podName:883d7c41-dae5-4d36-b39e-13485dde73de nodeName:}" failed. No retries permitted until 2026-04-16 16:24:04.870444195 +0000 UTC m=+18.659792315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret") pod "global-pull-secret-syncer-t98gk" (UID: "883d7c41-dae5-4d36-b39e-13485dde73de") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:04.370554 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.370457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/883d7c41-dae5-4d36-b39e-13485dde73de-dbus\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.874391 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:04.874358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:04.874833 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:04.874477 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:04.874833 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:04.874535 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret podName:883d7c41-dae5-4d36-b39e-13485dde73de nodeName:}" failed. No retries permitted until 2026-04-16 16:24:05.874517916 +0000 UTC m=+19.663866027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret") pod "global-pull-secret-syncer-t98gk" (UID: "883d7c41-dae5-4d36-b39e-13485dde73de") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:05.838689 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:05.838649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:05.838689 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:05.838664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:05.838853 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:05.838664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:05.838853 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:05.838749 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:05.838853 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:05.838838 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:05.839000 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:05.838899 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:05.882839 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:05.882816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:05.883149 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:05.882948 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:05.883149 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:05.882996 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret podName:883d7c41-dae5-4d36-b39e-13485dde73de nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:07.88298288 +0000 UTC m=+21.672330992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret") pod "global-pull-secret-syncer-t98gk" (UID: "883d7c41-dae5-4d36-b39e-13485dde73de") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:07.066489 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.066464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal" event={"ID":"0b233db55a7ade7d393ce1a96715106e","Type":"ContainerStarted","Data":"ae0a79a8699445485e57a248c053c8ea5d5287a171ebd648d025dffc73c096af"} Apr 16 16:24:07.067832 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.067812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-drsbg" event={"ID":"ee219be2-6e0e-45ac-874e-43970e574181","Type":"ContainerStarted","Data":"113e4e0c544c0fe18f0aa76afae2fbd784fa932cb3bb4c5621386ba633fde4ad"} Apr 16 16:24:07.070175 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:24:07.070386 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070369 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2efb54c-bc99-4126-ac22-6f7a17d6cd42" containerID="9981cceeb9ef82bf9343c4cfd24d4ffd1fd2c9612ff63a5da1c2e769235028b9" exitCode=1 Apr 16 16:24:07.070444 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"63f9188142d7adef7f0660a5f7eb582cfb15669c45e8d356eafd6442af472d5f"} Apr 16 16:24:07.070444 
ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"0a5b5300e36f55534c299e5ed132c67aaae2a7791845407b1b6c9c3188c6be55"} Apr 16 16:24:07.070444 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"1f4143cc64182dff29c6fa6803b853c7f52073f160da81661f495f1b4f49f1ca"} Apr 16 16:24:07.070444 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"aaaa5a9cb551567799ba5463ff1a2c6e65ad3085fcad2ba5aa3bc9e5558cb705"} Apr 16 16:24:07.070586 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerDied","Data":"9981cceeb9ef82bf9343c4cfd24d4ffd1fd2c9612ff63a5da1c2e769235028b9"} Apr 16 16:24:07.070586 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.070455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"18f5ee5a1cc221ec567add61c1d48ddac345fcc179fb75ccad2e7f9215d46d18"} Apr 16 16:24:07.071694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.071646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2nrk" event={"ID":"6e7fea14-3672-4005-bbe8-e59d933d3173","Type":"ContainerStarted","Data":"057f4a5fd15c203e9cdb6aee3e4e933f2e62af48da43952d8e6c9645bf43ffab"} Apr 16 
16:24:07.081095 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.080479 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-93.ec2.internal" podStartSLOduration=20.080466009 podStartE2EDuration="20.080466009s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:07.079961573 +0000 UTC m=+20.869309696" watchObservedRunningTime="2026-04-16 16:24:07.080466009 +0000 UTC m=+20.869814134" Apr 16 16:24:07.095490 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.095255 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-drsbg" podStartSLOduration=2.004862464 podStartE2EDuration="20.095240149s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.116427043 +0000 UTC m=+1.905775145" lastFinishedPulling="2026-04-16 16:24:06.206804713 +0000 UTC m=+19.996152830" observedRunningTime="2026-04-16 16:24:07.094581977 +0000 UTC m=+20.883930105" watchObservedRunningTime="2026-04-16 16:24:07.095240149 +0000 UTC m=+20.884588275" Apr 16 16:24:07.108653 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.108614 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x2nrk" podStartSLOduration=2.7171370919999998 podStartE2EDuration="21.108603119s" podCreationTimestamp="2026-04-16 16:23:46 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.084308803 +0000 UTC m=+1.873656906" lastFinishedPulling="2026-04-16 16:24:06.475774826 +0000 UTC m=+20.265122933" observedRunningTime="2026-04-16 16:24:07.108405731 +0000 UTC m=+20.897753856" watchObservedRunningTime="2026-04-16 16:24:07.108603119 +0000 UTC m=+20.897951244" Apr 16 16:24:07.838579 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.838549 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:07.838765 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.838626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:07.838828 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.838760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:07.838882 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:07.838867 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:07.838933 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:07.838763 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:07.838988 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:07.838962 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:07.899582 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:07.899564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:07.899694 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:07.899656 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:07.899759 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:07.899713 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret podName:883d7c41-dae5-4d36-b39e-13485dde73de nodeName:}" failed. No retries permitted until 2026-04-16 16:24:11.89969947 +0000 UTC m=+25.689047580 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret") pod "global-pull-secret-syncer-t98gk" (UID: "883d7c41-dae5-4d36-b39e-13485dde73de") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:08.020386 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.020367 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:24:08.074489 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.074467 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r5v2l" event={"ID":"ddc36c0b-3388-4a0a-a038-6e0a618d18c2","Type":"ContainerStarted","Data":"9c45c91be2eadd2625401fce4c62750b24bd8457e801971e89438b251413f4ff"} Apr 16 16:24:08.075752 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.075733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tqwdh" event={"ID":"8593e95d-58e6-43c3-99b0-5582e1e25f39","Type":"ContainerStarted","Data":"b0fbc604655d5989720bfec62d3fb73ee7b4f82780db739e453ab1fed39f79a4"} Apr 16 16:24:08.077170 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.077153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" event={"ID":"b7c4234e-0528-40b5-b909-5324aef04be7","Type":"ContainerStarted","Data":"8040c58098701a64d27af883e91837201298d15d41b64533b57b60116284518c"} Apr 16 16:24:08.077247 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.077175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" event={"ID":"b7c4234e-0528-40b5-b909-5324aef04be7","Type":"ContainerStarted","Data":"5ff42620d41f25a952b55675f8495840af910076b79a7fdb06f361265ae061bf"} Apr 16 16:24:08.078360 ip-10-0-141-93 kubenswrapper[2576]: I0416 
16:24:08.078339 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f23044-6510-4235-820f-fdca93d4bab6" containerID="8fd972907e789fedee4ea3b52172969aab2af50ab0fcd585c687f654bf913ace" exitCode=0 Apr 16 16:24:08.078448 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.078374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerDied","Data":"8fd972907e789fedee4ea3b52172969aab2af50ab0fcd585c687f654bf913ace"} Apr 16 16:24:08.079608 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.079586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfglr" event={"ID":"32a91c59-c74e-45df-ab79-de8449b1b1e3","Type":"ContainerStarted","Data":"ecb4646b1c9d0cf8081d127c88ce58805519588dcea1a65525ad8247eeae7d9f"} Apr 16 16:24:08.080846 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.080817 2576 generic.go:358] "Generic (PLEG): container finished" podID="d69bb2ba9c62ea055a62e31adcc63bda" containerID="c5069368a087ee178d38555f2825d020db7f82177e6cdbe1eaceb3b3dbe36f26" exitCode=0 Apr 16 16:24:08.080925 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.080891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal" event={"ID":"d69bb2ba9c62ea055a62e31adcc63bda","Type":"ContainerDied","Data":"c5069368a087ee178d38555f2825d020db7f82177e6cdbe1eaceb3b3dbe36f26"} Apr 16 16:24:08.082094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.082070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2jhj4" event={"ID":"063cc77b-5a11-4e5a-a733-15acf54a40e8","Type":"ContainerStarted","Data":"29f529f6913faf794b41bc949d0a86eec486e1e9abc29c1ec7fa69d572e9c9c8"} Apr 16 16:24:08.089791 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.089758 2576 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-network-operator/iptables-alerter-r5v2l" podStartSLOduration=3.051155811 podStartE2EDuration="21.089747963s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.166794499 +0000 UTC m=+1.956142605" lastFinishedPulling="2026-04-16 16:24:06.205386639 +0000 UTC m=+19.994734757" observedRunningTime="2026-04-16 16:24:08.089469673 +0000 UTC m=+21.878817799" watchObservedRunningTime="2026-04-16 16:24:08.089747963 +0000 UTC m=+21.879096079" Apr 16 16:24:08.117858 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.117826 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tqwdh" podStartSLOduration=3.978348826 podStartE2EDuration="22.117815088s" podCreationTimestamp="2026-04-16 16:23:46 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.067404162 +0000 UTC m=+1.856752266" lastFinishedPulling="2026-04-16 16:24:06.206870417 +0000 UTC m=+19.996218528" observedRunningTime="2026-04-16 16:24:08.117556427 +0000 UTC m=+21.906904552" watchObservedRunningTime="2026-04-16 16:24:08.117815088 +0000 UTC m=+21.907163212" Apr 16 16:24:08.130132 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.130058 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zfglr" podStartSLOduration=3.058251802 podStartE2EDuration="21.130048585s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.133198561 +0000 UTC m=+1.922546677" lastFinishedPulling="2026-04-16 16:24:06.204995354 +0000 UTC m=+19.994343460" observedRunningTime="2026-04-16 16:24:08.129496473 +0000 UTC m=+21.918844598" watchObservedRunningTime="2026-04-16 16:24:08.130048585 +0000 UTC m=+21.919396714" Apr 16 16:24:08.159359 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.159329 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2jhj4" 
podStartSLOduration=3.138199841 podStartE2EDuration="21.159317899s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.1842454 +0000 UTC m=+1.973593503" lastFinishedPulling="2026-04-16 16:24:06.205363454 +0000 UTC m=+19.994711561" observedRunningTime="2026-04-16 16:24:08.158821602 +0000 UTC m=+21.948169726" watchObservedRunningTime="2026-04-16 16:24:08.159317899 +0000 UTC m=+21.948666058" Apr 16 16:24:08.810055 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.809951 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:24:08.02038232Z","UUID":"87076a71-6437-4d77-84d0-6880a17bdb31","Handler":null,"Name":"","Endpoint":""} Apr 16 16:24:08.811865 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.811841 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:24:08.811865 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:08.811867 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:24:09.087316 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.087251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:24:09.087703 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.087634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"70dffb481dade55d724ceabdbdeaf396608544d3c97d33a166c509a44e4c045d"} Apr 16 16:24:09.089753 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.089733 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" event={"ID":"b7c4234e-0528-40b5-b909-5324aef04be7","Type":"ContainerStarted","Data":"35ae7dda2cc089b704d1151af276284c867db8625229ef4b348d77a70426bfa4"} Apr 16 16:24:09.091341 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.091317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal" event={"ID":"d69bb2ba9c62ea055a62e31adcc63bda","Type":"ContainerStarted","Data":"e5b17c783ba1e01c5a15747bdeb41737c423fb2883fb440aeaa8f95652ee2eac"} Apr 16 16:24:09.112911 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.112871 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vf5rw" podStartSLOduration=1.411473562 podStartE2EDuration="22.112858713s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.187994361 +0000 UTC m=+1.977342470" lastFinishedPulling="2026-04-16 16:24:08.889379513 +0000 UTC m=+22.678727621" observedRunningTime="2026-04-16 16:24:09.112607197 +0000 UTC m=+22.901955321" watchObservedRunningTime="2026-04-16 16:24:09.112858713 +0000 UTC m=+22.902206839" Apr 16 16:24:09.129208 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.129168 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-93.ec2.internal" podStartSLOduration=22.129155819 podStartE2EDuration="22.129155819s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:09.128583892 +0000 UTC m=+22.917932020" watchObservedRunningTime="2026-04-16 16:24:09.129155819 +0000 UTC m=+22.918503949" Apr 16 16:24:09.839296 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.839269 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:09.839462 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.839269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:09.839462 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:09.839383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:09.839462 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:09.839269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:09.839622 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:09.839456 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:09.839622 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:09.839525 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:11.839314 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:11.839118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:11.839756 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:11.839129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:11.839756 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:11.839403 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:11.839756 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:11.839131 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:11.839756 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:11.839475 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:11.839756 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:11.839547 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:11.935203 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:11.935177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:11.935342 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:11.935301 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:11.935395 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:11.935350 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret podName:883d7c41-dae5-4d36-b39e-13485dde73de nodeName:}" failed. No retries permitted until 2026-04-16 16:24:19.935335656 +0000 UTC m=+33.724683764 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret") pod "global-pull-secret-syncer-t98gk" (UID: "883d7c41-dae5-4d36-b39e-13485dde73de") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:24:12.245207 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:12.245176 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2jhj4" Apr 16 16:24:12.245905 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:12.245881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2jhj4" Apr 16 16:24:13.099684 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.099647 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f23044-6510-4235-820f-fdca93d4bab6" containerID="d35021fafccaef565baf89347a34463051cac1634b1578e0f84e8b7f20968181" exitCode=0 Apr 16 16:24:13.100454 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.099735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerDied","Data":"d35021fafccaef565baf89347a34463051cac1634b1578e0f84e8b7f20968181"} Apr 16 16:24:13.102648 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.102631 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:24:13.103046 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.103026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"a85287a2922d9858207dad276f4581dfe463a23650e56de0bc8d14134863c1a0"} Apr 16 16:24:13.103329 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.103314 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2jhj4" Apr 16 16:24:13.103406 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.103337 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:24:13.103406 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.103352 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:24:13.103406 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.103364 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:24:13.103555 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.103491 2576 scope.go:117] "RemoveContainer" containerID="9981cceeb9ef82bf9343c4cfd24d4ffd1fd2c9612ff63a5da1c2e769235028b9" Apr 16 16:24:13.104304 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.104287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2jhj4" Apr 16 16:24:13.118169 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.118153 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:24:13.118965 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.118949 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" Apr 16 16:24:13.839629 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.839302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:13.839629 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.839372 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:13.839810 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:13.839643 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:13.839810 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:13.839302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:13.839810 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:13.839773 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:13.839957 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:13.839855 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:14.024900 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.024875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t98gk"] Apr 16 16:24:14.028154 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.028132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t7shk"] Apr 16 16:24:14.030873 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.030854 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6dvl8"] Apr 16 16:24:14.107687 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.107599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:24:14.107987 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.107927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" event={"ID":"c2efb54c-bc99-4126-ac22-6f7a17d6cd42","Type":"ContainerStarted","Data":"4dbf2868adf5157c2189587efd4f299dbd8e748be5b76aaaff3c5ef81ba78e1f"} Apr 16 16:24:14.109760 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.109733 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f23044-6510-4235-820f-fdca93d4bab6" containerID="62ca6ef051091f1935c2874c46479e85b7658499b9e26e8493714f6af99d1041" exitCode=0 Apr 16 16:24:14.109858 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.109761 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerDied","Data":"62ca6ef051091f1935c2874c46479e85b7658499b9e26e8493714f6af99d1041"} Apr 16 16:24:14.109858 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.109842 2576 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:14.109858 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.109847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:14.109990 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:14.109911 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:14.109990 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.109959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:14.110086 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:14.110065 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:14.110284 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:14.110265 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:14.139043 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:14.138996 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd" podStartSLOduration=9.02499548 podStartE2EDuration="27.138981852s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="2026-04-16 16:23:48.104213292 +0000 UTC m=+1.893561395" lastFinishedPulling="2026-04-16 16:24:06.21819965 +0000 UTC m=+20.007547767" observedRunningTime="2026-04-16 16:24:14.137804721 +0000 UTC m=+27.927152846" watchObservedRunningTime="2026-04-16 16:24:14.138981852 +0000 UTC m=+27.928329979" Apr 16 16:24:15.113491 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:15.113459 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f23044-6510-4235-820f-fdca93d4bab6" containerID="b941eff584594b9d0b221b44e9190785e47e7f7af97de811e056eb363ce9bd55" exitCode=0 Apr 16 16:24:15.113874 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:15.113533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerDied","Data":"b941eff584594b9d0b221b44e9190785e47e7f7af97de811e056eb363ce9bd55"} Apr 16 16:24:15.839161 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:15.839089 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:15.839161 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:15.839138 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:15.839378 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:15.839226 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:15.839378 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:15.839272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:15.839378 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:15.839326 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:15.839521 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:15.839381 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:17.839505 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:17.839474 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:17.839939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:17.839475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:17.839939 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:17.839579 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t98gk" podUID="883d7c41-dae5-4d36-b39e-13485dde73de" Apr 16 16:24:17.839939 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:17.839702 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6dvl8" podUID="6f605cef-2cd0-4480-b5a0-4bb58f196ac7" Apr 16 16:24:17.839939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:17.839475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:17.839939 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:17.839784 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t7shk" podUID="903cab10-206d-4fef-bebf-bbf8db046d19" Apr 16 16:24:19.494920 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.494887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:19.495366 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.495056 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:19.495366 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.495128 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs podName:6f605cef-2cd0-4480-b5a0-4bb58f196ac7 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.495106037 +0000 UTC m=+65.284454167 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs") pod "network-metrics-daemon-6dvl8" (UID: "6f605cef-2cd0-4480-b5a0-4bb58f196ac7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:24:19.506442 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.506419 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-93.ec2.internal" event="NodeReady" Apr 16 16:24:19.506568 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.506529 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:24:19.545421 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.545397 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"] Apr 16 16:24:19.580979 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.580950 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86cd4845fd-xltjh"] Apr 16 16:24:19.581127 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.581103 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" Apr 16 16:24:19.583809 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.583788 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 16:24:19.583959 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.583827 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.584063 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.583837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.584063 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.583906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bwx75\"" Apr 16 16:24:19.584258 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.584240 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 16:24:19.596632 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.596201 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6szqp"] Apr 16 16:24:19.629050 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.629024 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"] Apr 16 16:24:19.629163 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.629075 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.629222 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.629169 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:19.632361 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.632361 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632255 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 16:24:19.632361 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632288 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:24:19.632361 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632335 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:24:19.632619 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 16:24:19.632619 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632566 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-52wm9\"" Apr 16 16:24:19.632726 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nvqpx\"" Apr 16 16:24:19.632896 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.632876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.633179 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.633157 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:24:19.643619 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.643594 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"] Apr 16 16:24:19.643780 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.643763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:19.646589 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.646572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b8zw7\"" Apr 16 16:24:19.647084 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.647067 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.647364 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.647345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 16:24:19.647450 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.647345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.647450 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.647351 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 16:24:19.651087 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.651050 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:24:19.651343 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.651323 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 16:24:19.660632 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.660607 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48"] Apr 16 16:24:19.660761 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.660747 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" Apr 16 16:24:19.664279 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.664245 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:24:19.664374 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.664308 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.664374 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.664263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ttbvb\"" Apr 16 16:24:19.664374 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.664359 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.664374 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.664368 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:24:19.684548 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.684531 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-pxgkx"] Apr 16 16:24:19.684702 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:24:19.684687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48" Apr 16 16:24:19.687343 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.687320 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.687443 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.687427 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.687711 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.687693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-qh2zf\"" Apr 16 16:24:19.696814 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.696795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-config\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" Apr 16 16:24:19.696918 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.696829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.696918 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.696901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-trusted-ca\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697033 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.696934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-bound-sa-token\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697033 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.696966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:19.697033 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.696986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-image-registry-private-configuration\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697033 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.697008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-serving-cert\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.697039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f15b3f4f-12e6-4033-8a06-87894739df95-ca-trust-extracted\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.697066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z65k\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-kube-api-access-6z65k\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.697085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp6kw\" (UniqueName: \"kubernetes.io/projected/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-kube-api-access-mp6kw\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.697095 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.697111 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:24:19.697222 
ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.697121 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cfkt6 for pod openshift-network-diagnostics/network-check-target-t7shk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.697137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-registry-certificates\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.697153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-installation-pull-secrets\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:19.697222 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.697172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6 podName:903cab10-206d-4fef-bebf-bbf8db046d19 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.697156873 +0000 UTC m=+65.486504988 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cfkt6" (UniqueName: "kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6") pod "network-check-target-t7shk" (UID: "903cab10-206d-4fef-bebf-bbf8db046d19") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:24:19.700248 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.700229 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"] Apr 16 16:24:19.700389 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.700369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" Apr 16 16:24:19.702848 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.702830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 16:24:19.702962 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.702903 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.703199 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.703184 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 16:24:19.703300 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.703215 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-cpjqx\"" Apr 16 16:24:19.703362 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.703328 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.710516 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.710496 2576 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5dc6546dd6-bm7n7"] Apr 16 16:24:19.710832 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.710814 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" Apr 16 16:24:19.714718 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.714700 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lkv92\"" Apr 16 16:24:19.714843 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.714720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 16:24:19.714912 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.714770 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.714967 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.714799 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.716514 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.716498 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 16:24:19.729668 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.729649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"] Apr 16 16:24:19.729668 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.729671 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mht57"] Apr 16 16:24:19.729810 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.729782 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:19.732345 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.732311 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:24:19.732574 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.732540 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 16:24:19.732833 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.732808 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:24:19.732833 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.732823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 16:24:19.732961 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.732853 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 16:24:19.732961 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.732813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 16:24:19.733129 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.733108 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-z4m9w\"" Apr 16 16:24:19.738293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738276 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6szqp"] Apr 16 16:24:19.738293 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"] Apr 16 
16:24:19.738428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86cd4845fd-xltjh"]
Apr 16 16:24:19.738428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48"]
Apr 16 16:24:19.738428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738319 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-pxgkx"]
Apr 16 16:24:19.738428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738328 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5wlnl"]
Apr 16 16:24:19.738428 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.738411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57"
Apr 16 16:24:19.740932 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.740912 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:24:19.741146 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.741131 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:24:19.741211 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.741146 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-cgcjs\""
Apr 16 16:24:19.754048 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.753961 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5dc6546dd6-bm7n7"]
Apr 16 16:24:19.754048 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.753984 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mht57"]
Apr 16 16:24:19.754048 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.754009 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"]
Apr 16 16:24:19.754048 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.754020 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"]
Apr 16 16:24:19.754048 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.754032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5wlnl"]
Apr 16 16:24:19.754048 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.754052 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nnptv"]
Apr 16 16:24:19.754380 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.754076 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:19.756337 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.756322 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-psw85\""
Apr 16 16:24:19.756430 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.756390 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:24:19.756601 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.756501 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:24:19.768733 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.768716 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nnptv"]
Apr 16 16:24:19.768839 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.768801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nnptv"
Apr 16 16:24:19.771358 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.771322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:24:19.771448 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.771395 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:24:19.771448 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.771418 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hp8vh\""
Apr 16 16:24:19.771562 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.771547 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:24:19.797776 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-config\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.797860 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.797860 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-serving-cert\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.797860 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:19.798019 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68jvv\" (UniqueName: \"kubernetes.io/projected/bfa69e98-1a05-4478-9f56-e8ca514789be-kube-api-access-68jvv\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.798019 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gmj2\" (UniqueName: \"kubernetes.io/projected/2e60d773-ddaa-48ec-b63d-69179db32795-kube-api-access-7gmj2\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.798019 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60d773-ddaa-48ec-b63d-69179db32795-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.798019 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa69e98-1a05-4478-9f56-e8ca514789be-config\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.798019 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.797999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60d773-ddaa-48ec-b63d-69179db32795-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.798235 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e60d773-ddaa-48ec-b63d-69179db32795-serving-cert\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.798235 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tggj2\" (UniqueName: \"kubernetes.io/projected/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-kube-api-access-tggj2\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"
Apr 16 16:24:19.798235 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.798112 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:24:19.798235 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.798133 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86cd4845fd-xltjh: secret "image-registry-tls" not found
Apr 16 16:24:19.798235 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z65k\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-kube-api-access-6z65k\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798235 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.798221 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls podName:f15b3f4f-12e6-4033-8a06-87894739df95 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.298190551 +0000 UTC m=+34.087538655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls") pod "image-registry-86cd4845fd-xltjh" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95") : secret "image-registry-tls" not found
Apr 16 16:24:19.798532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:19.798532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-config\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.798532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798423 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdlmp\" (UniqueName: \"kubernetes.io/projected/4b370c20-7985-47c9-b2b1-e685ab180a6e-kube-api-access-rdlmp\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:19.798532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-registry-certificates\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-installation-pull-secrets\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e60d773-ddaa-48ec-b63d-69179db32795-tmp\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlws\" (UniqueName: \"kubernetes.io/projected/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-kube-api-access-7jlws\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-bound-sa-token\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-trusted-ca\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa69e98-1a05-4478-9f56-e8ca514789be-serving-cert\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798735 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9cxm\" (UniqueName: \"kubernetes.io/projected/ad1a58a4-dd90-4915-8c28-27977a5f2692-kube-api-access-j9cxm\") pod \"volume-data-source-validator-7d955d5dd4-7ql48\" (UID: \"ad1a58a4-dd90-4915-8c28-27977a5f2692\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-image-registry-private-configuration\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfa69e98-1a05-4478-9f56-e8ca514789be-trusted-ca\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f15b3f4f-12e6-4033-8a06-87894739df95-ca-trust-extracted\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.798842 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp6kw\" (UniqueName: \"kubernetes.io/projected/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-kube-api-access-mp6kw\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.799284 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"
Apr 16 16:24:19.799284 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.798891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2e60d773-ddaa-48ec-b63d-69179db32795-snapshots\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.799358 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.799294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f15b3f4f-12e6-4033-8a06-87894739df95-ca-trust-extracted\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.799497 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.799451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-registry-certificates\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.800164 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.800114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-trusted-ca\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.803025 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.802890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-serving-cert\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.803099 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.802966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-installation-pull-secrets\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.803099 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.802979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-image-registry-private-configuration\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.807950 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.807924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-bound-sa-token\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.808054 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.808033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp6kw\" (UniqueName: \"kubernetes.io/projected/f06b7a51-b7d8-47f8-ab24-e49d59b3cdad-kube-api-access-mp6kw\") pod \"service-ca-operator-69965bb79d-kklp6\" (UID: \"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.808178 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.808161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z65k\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-kube-api-access-6z65k\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:19.839420 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.839394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:24:19.839523 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.839399 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk"
Apr 16 16:24:19.839610 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.839405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8"
Apr 16 16:24:19.842098 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.842073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:24:19.842206 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.842120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f6w5w\""
Apr 16 16:24:19.842206 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.842120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w5tzs\""
Apr 16 16:24:19.842357 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.842343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 16:24:19.893608 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.893591 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"
Apr 16 16:24:19.899445 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdlmp\" (UniqueName: \"kubernetes.io/projected/4b370c20-7985-47c9-b2b1-e685ab180a6e-kube-api-access-rdlmp\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:19.899528 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fjj\" (UniqueName: \"kubernetes.io/projected/22155b83-35e6-40ea-8b7e-1fc752b875eb-kube-api-access-p6fjj\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:19.899528 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e60d773-ddaa-48ec-b63d-69179db32795-tmp\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.899528 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlws\" (UniqueName: \"kubernetes.io/projected/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-kube-api-access-7jlws\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:19.899668 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"
Apr 16 16:24:19.899668 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899664 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:19.899799 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:19.899799 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa69e98-1a05-4478-9f56-e8ca514789be-serving-cert\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.899799 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9cxm\" (UniqueName: \"kubernetes.io/projected/ad1a58a4-dd90-4915-8c28-27977a5f2692-kube-api-access-j9cxm\") pod \"volume-data-source-validator-7d955d5dd4-7ql48\" (UID: \"ad1a58a4-dd90-4915-8c28-27977a5f2692\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48"
Apr 16 16:24:19.899936 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:19.899936 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:19.899936 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.899882 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:24:19.899936 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2e60d773-ddaa-48ec-b63d-69179db32795-tmp\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.899936 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfa69e98-1a05-4478-9f56-e8ca514789be-trusted-ca\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.899936 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.899935 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls podName:4b370c20-7985-47c9-b2b1-e685ab180a6e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.399916379 +0000 UTC m=+34.189264495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls") pod "cluster-samples-operator-667775844f-vhnrp" (UID: "4b370c20-7985-47c9-b2b1-e685ab180a6e") : secret "samples-operator-tls" not found
Apr 16 16:24:19.900189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.899969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-config-volume\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:19.900189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-default-certificate\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:19.900189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"
Apr 16 16:24:19.900189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-stats-auth\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:19.900189 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2e60d773-ddaa-48ec-b63d-69179db32795-snapshots\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.900433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ng2c\" (UniqueName: \"kubernetes.io/projected/b4dd6fba-5a94-4e3e-92e2-74610c8a58bf-kube-api-access-6ng2c\") pod \"network-check-source-7b678d77c7-mht57\" (UID: \"b4dd6fba-5a94-4e3e-92e2-74610c8a58bf\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57"
Apr 16 16:24:19.900433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prh58\" (UniqueName: \"kubernetes.io/projected/ddb0d5c7-7432-473d-a4e0-7822ea15651e-kube-api-access-prh58\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv"
Apr 16 16:24:19.900433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-tmp-dir\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:19.900433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:19.900433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68jvv\" (UniqueName: \"kubernetes.io/projected/bfa69e98-1a05-4478-9f56-e8ca514789be-kube-api-access-68jvv\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.900789 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gmj2\" (UniqueName: \"kubernetes.io/projected/2e60d773-ddaa-48ec-b63d-69179db32795-kube-api-access-7gmj2\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.900853 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60d773-ddaa-48ec-b63d-69179db32795-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.900853 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa69e98-1a05-4478-9f56-e8ca514789be-config\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:19.900953 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/2e60d773-ddaa-48ec-b63d-69179db32795-snapshots\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.900953 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60d773-ddaa-48ec-b63d-69179db32795-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.901065 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.900968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e60d773-ddaa-48ec-b63d-69179db32795-serving-cert\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx"
Apr 16 16:24:19.901065 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.901004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tggj2\" (UniqueName:
\"kubernetes.io/projected/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-kube-api-access-tggj2\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" Apr 16 16:24:19.901065 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.901044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:19.901188 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.901068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gp55\" (UniqueName: \"kubernetes.io/projected/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-kube-api-access-4gp55\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:19.902032 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.901892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bfa69e98-1a05-4478-9f56-e8ca514789be-trusted-ca\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:19.902331 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.902262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60d773-ddaa-48ec-b63d-69179db32795-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" Apr 16 
16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.902609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.902627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60d773-ddaa-48ec-b63d-69179db32795-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.902738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.902909 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.902942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfa69e98-1a05-4478-9f56-e8ca514789be-config\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 
16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:19.902980 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls podName:476e7e41-cee5-4dbf-bd5f-7d3a9ce62024 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.402963391 +0000 UTC m=+34.192311499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-2kdhn" (UID: "476e7e41-cee5-4dbf-bd5f-7d3a9ce62024") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.903397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfa69e98-1a05-4478-9f56-e8ca514789be-serving-cert\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.905199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" Apr 16 16:24:19.906038 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.905338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e60d773-ddaa-48ec-b63d-69179db32795-serving-cert\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" Apr 16 16:24:19.907281 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.907257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" Apr 16 16:24:19.911091 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.910980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68jvv\" (UniqueName: \"kubernetes.io/projected/bfa69e98-1a05-4478-9f56-e8ca514789be-kube-api-access-68jvv\") pod \"console-operator-d87b8d5fc-6szqp\" (UID: \"bfa69e98-1a05-4478-9f56-e8ca514789be\") " pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:19.911702 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.911656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9cxm\" (UniqueName: \"kubernetes.io/projected/ad1a58a4-dd90-4915-8c28-27977a5f2692-kube-api-access-j9cxm\") pod \"volume-data-source-validator-7d955d5dd4-7ql48\" (UID: \"ad1a58a4-dd90-4915-8c28-27977a5f2692\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48" Apr 16 16:24:19.911964 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.911943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tggj2\" (UniqueName: \"kubernetes.io/projected/7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22-kube-api-access-tggj2\") pod \"kube-storage-version-migrator-operator-756bb7d76f-6rs86\" (UID: \"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" Apr 16 16:24:19.912002 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:24:19.911984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlws\" (UniqueName: \"kubernetes.io/projected/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-kube-api-access-7jlws\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:19.912922 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.912901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdlmp\" (UniqueName: \"kubernetes.io/projected/4b370c20-7985-47c9-b2b1-e685ab180a6e-kube-api-access-rdlmp\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" Apr 16 16:24:19.913207 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.913186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gmj2\" (UniqueName: \"kubernetes.io/projected/2e60d773-ddaa-48ec-b63d-69179db32795-kube-api-access-7gmj2\") pod \"insights-operator-5785d4fcdd-pxgkx\" (UID: \"2e60d773-ddaa-48ec-b63d-69179db32795\") " pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" Apr 16 16:24:19.947220 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.947203 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:19.970176 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.970151 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" Apr 16 16:24:19.993669 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:19.993651 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48" Apr 16 16:24:20.003345 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prh58\" (UniqueName: \"kubernetes.io/projected/ddb0d5c7-7432-473d-a4e0-7822ea15651e-kube-api-access-prh58\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:20.003423 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-tmp-dir\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.003423 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:20.003536 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gp55\" (UniqueName: \"kubernetes.io/projected/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-kube-api-access-4gp55\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.003536 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fjj\" (UniqueName: \"kubernetes.io/projected/22155b83-35e6-40ea-8b7e-1fc752b875eb-kube-api-access-p6fjj\") pod 
\"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.003536 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.003696 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:20.003696 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.003696 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.003849 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-config-volume\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.003849 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-default-certificate\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.003849 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.003738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-tmp-dir\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.003849 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.003816 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:24:20.003849 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.003845 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:24:20.004104 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.003887 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert podName:ddb0d5c7-7432-473d-a4e0-7822ea15651e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.503869836 +0000 UTC m=+34.293217945 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert") pod "ingress-canary-nnptv" (UID: "ddb0d5c7-7432-473d-a4e0-7822ea15651e") : secret "canary-serving-cert" not found Apr 16 16:24:20.004104 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.003906 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls podName:bfdb87ca-0166-4b3f-83a8-f352f00ae0c6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.503896837 +0000 UTC m=+34.293244948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls") pod "dns-default-5wlnl" (UID: "bfdb87ca-0166-4b3f-83a8-f352f00ae0c6") : secret "dns-default-metrics-tls" not found Apr 16 16:24:20.004104 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.003973 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:24:20.004104 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.004026 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.504014344 +0000 UTC m=+34.293362465 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : secret "router-metrics-certs-default" not found Apr 16 16:24:20.004104 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.004067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-stats-auth\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.004370 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.004108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ng2c\" (UniqueName: \"kubernetes.io/projected/b4dd6fba-5a94-4e3e-92e2-74610c8a58bf-kube-api-access-6ng2c\") pod \"network-check-source-7b678d77c7-mht57\" (UID: \"b4dd6fba-5a94-4e3e-92e2-74610c8a58bf\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57" Apr 16 16:24:20.004370 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.004330 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:20.504312525 +0000 UTC m=+34.293660661 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:20.005406 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.005344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-config-volume\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.006446 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.006424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-default-certificate\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.006552 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.006536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/883d7c41-dae5-4d36-b39e-13485dde73de-original-pull-secret\") pod \"global-pull-secret-syncer-t98gk\" (UID: \"883d7c41-dae5-4d36-b39e-13485dde73de\") " pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:20.006624 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.006606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-stats-auth\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.012064 ip-10-0-141-93 kubenswrapper[2576]: I0416 
16:24:20.012041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" Apr 16 16:24:20.013013 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.012970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prh58\" (UniqueName: \"kubernetes.io/projected/ddb0d5c7-7432-473d-a4e0-7822ea15651e-kube-api-access-prh58\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:20.013108 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.013081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fjj\" (UniqueName: \"kubernetes.io/projected/22155b83-35e6-40ea-8b7e-1fc752b875eb-kube-api-access-p6fjj\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:20.014928 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.014893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ng2c\" (UniqueName: \"kubernetes.io/projected/b4dd6fba-5a94-4e3e-92e2-74610c8a58bf-kube-api-access-6ng2c\") pod \"network-check-source-7b678d77c7-mht57\" (UID: \"b4dd6fba-5a94-4e3e-92e2-74610c8a58bf\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57" Apr 16 16:24:20.015103 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.015085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gp55\" (UniqueName: \"kubernetes.io/projected/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-kube-api-access-4gp55\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:20.047763 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.047741 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57" Apr 16 16:24:20.157692 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.157652 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t98gk" Apr 16 16:24:20.307932 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.307850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:20.308082 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.307999 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:24:20.308082 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.308020 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86cd4845fd-xltjh: secret "image-registry-tls" not found Apr 16 16:24:20.308082 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.308081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls podName:f15b3f4f-12e6-4033-8a06-87894739df95 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.308062918 +0000 UTC m=+35.097411031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls") pod "image-registry-86cd4845fd-xltjh" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95") : secret "image-registry-tls" not found
Apr 16 16:24:20.409251 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.409217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:20.409404 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.409306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:20.409404 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.409380 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:20.409516 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.409447 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:24:20.409516 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.409466 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls podName:476e7e41-cee5-4dbf-bd5f-7d3a9ce62024 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.409444705 +0000 UTC m=+35.198792816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-2kdhn" (UID: "476e7e41-cee5-4dbf-bd5f-7d3a9ce62024") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:20.409620 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.409535 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls podName:4b370c20-7985-47c9-b2b1-e685ab180a6e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.409488176 +0000 UTC m=+35.198836287 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls") pod "cluster-samples-operator-667775844f-vhnrp" (UID: "4b370c20-7985-47c9-b2b1-e685ab180a6e") : secret "samples-operator-tls" not found
Apr 16 16:24:20.510791 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.510757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.510870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv"
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.510901 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.510941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.510963 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.510946987 +0000 UTC m=+35.300295090 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : secret "router-metrics-certs-default" not found
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.511010 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.511023 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.511027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.511064 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls podName:bfdb87ca-0166-4b3f-83a8-f352f00ae0c6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.511050085 +0000 UTC m=+35.300398195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls") pod "dns-default-5wlnl" (UID: "bfdb87ca-0166-4b3f-83a8-f352f00ae0c6") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.511081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert podName:ddb0d5c7-7432-473d-a4e0-7822ea15651e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.511072209 +0000 UTC m=+35.300420320 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert") pod "ingress-canary-nnptv" (UID: "ddb0d5c7-7432-473d-a4e0-7822ea15651e") : secret "canary-serving-cert" not found
Apr 16 16:24:20.511201 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:20.511133 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:21.511120531 +0000 UTC m=+35.300468636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : configmap references non-existent config key: service-ca.crt
Apr 16 16:24:20.868327 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.868301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t98gk"]
Apr 16 16:24:20.878230 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.878205 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48"]
Apr 16 16:24:20.895480 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.895435 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-mht57"]
Apr 16 16:24:20.896848 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.896820 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86"]
Apr 16 16:24:20.898512 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.898495 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6"]
Apr 16 16:24:20.899312 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.899275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-6szqp"]
Apr 16 16:24:20.912667 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:20.912645 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-pxgkx"]
Apr 16 16:24:20.948411 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.948380 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod883d7c41_dae5_4d36_b39e_13485dde73de.slice/crio-80c421264580713b6623cd8214e410281b6cd2ee2d3ac379f79f7b821cf17e62 WatchSource:0}: Error finding container 80c421264580713b6623cd8214e410281b6cd2ee2d3ac379f79f7b821cf17e62: Status 404 returned error can't find the container with id 80c421264580713b6623cd8214e410281b6cd2ee2d3ac379f79f7b821cf17e62
Apr 16 16:24:20.949315 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.949287 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad1a58a4_dd90_4915_8c28_27977a5f2692.slice/crio-525f5140d0088d7c35c48cd70edcac9320c8706dabf9a93a3c61f44933dcc813 WatchSource:0}: Error finding container 525f5140d0088d7c35c48cd70edcac9320c8706dabf9a93a3c61f44933dcc813: Status 404 returned error can't find the container with id 525f5140d0088d7c35c48cd70edcac9320c8706dabf9a93a3c61f44933dcc813
Apr 16 16:24:20.950181 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.950017 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f8bcfec_4683_4c6e_9b31_ee63b9f0ff22.slice/crio-7bcd1d0881cb1ee3c7171f770d514a529e7ac64e51fff1606ce0d4e6827172f8 WatchSource:0}: Error finding container 7bcd1d0881cb1ee3c7171f770d514a529e7ac64e51fff1606ce0d4e6827172f8: Status 404 returned error can't find the container with id 7bcd1d0881cb1ee3c7171f770d514a529e7ac64e51fff1606ce0d4e6827172f8
Apr 16 16:24:20.950786 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.950760 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4dd6fba_5a94_4e3e_92e2_74610c8a58bf.slice/crio-c41073f894f4aefb0ef55ab590ab30d35d80e7ba891f70db9f08a728ac2185bb WatchSource:0}: Error finding container c41073f894f4aefb0ef55ab590ab30d35d80e7ba891f70db9f08a728ac2185bb: Status 404 returned error can't find the container with id c41073f894f4aefb0ef55ab590ab30d35d80e7ba891f70db9f08a728ac2185bb
Apr 16 16:24:20.952625 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.952436 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa69e98_1a05_4478_9f56_e8ca514789be.slice/crio-20945651c779069fcb19c23dc73413236f0c792bc38de3f04eee801743f38e1c WatchSource:0}: Error finding container 20945651c779069fcb19c23dc73413236f0c792bc38de3f04eee801743f38e1c: Status 404 returned error can't find the container with id 20945651c779069fcb19c23dc73413236f0c792bc38de3f04eee801743f38e1c
Apr 16 16:24:20.953085 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.952990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf06b7a51_b7d8_47f8_ab24_e49d59b3cdad.slice/crio-27880d3d8a6bcde19c77002d6c41041e3939f542b54bd1c12d08e26a49dcaa12 WatchSource:0}: Error finding container 27880d3d8a6bcde19c77002d6c41041e3939f542b54bd1c12d08e26a49dcaa12: Status 404 returned error can't find the container with id 27880d3d8a6bcde19c77002d6c41041e3939f542b54bd1c12d08e26a49dcaa12
Apr 16 16:24:20.962865 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:20.962841 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e60d773_ddaa_48ec_b63d_69179db32795.slice/crio-647eac4a00d04e141f58bee48820f321ad2e6e553e38897cbc0ddc3dfdec0ca7 WatchSource:0}: Error finding container 647eac4a00d04e141f58bee48820f321ad2e6e553e38897cbc0ddc3dfdec0ca7: Status 404 returned error can't find the container with id 647eac4a00d04e141f58bee48820f321ad2e6e553e38897cbc0ddc3dfdec0ca7
Apr 16 16:24:21.123021 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.122847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" event={"ID":"bfa69e98-1a05-4478-9f56-e8ca514789be","Type":"ContainerStarted","Data":"20945651c779069fcb19c23dc73413236f0c792bc38de3f04eee801743f38e1c"}
Apr 16 16:24:21.123811 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.123778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t98gk" event={"ID":"883d7c41-dae5-4d36-b39e-13485dde73de","Type":"ContainerStarted","Data":"80c421264580713b6623cd8214e410281b6cd2ee2d3ac379f79f7b821cf17e62"}
Apr 16 16:24:21.124751 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.124727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57" event={"ID":"b4dd6fba-5a94-4e3e-92e2-74610c8a58bf","Type":"ContainerStarted","Data":"c41073f894f4aefb0ef55ab590ab30d35d80e7ba891f70db9f08a728ac2185bb"}
Apr 16 16:24:21.125590 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.125573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" event={"ID":"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22","Type":"ContainerStarted","Data":"7bcd1d0881cb1ee3c7171f770d514a529e7ac64e51fff1606ce0d4e6827172f8"}
Apr 16 16:24:21.126444 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.126427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48" event={"ID":"ad1a58a4-dd90-4915-8c28-27977a5f2692","Type":"ContainerStarted","Data":"525f5140d0088d7c35c48cd70edcac9320c8706dabf9a93a3c61f44933dcc813"}
Apr 16 16:24:21.127235 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.127218 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" event={"ID":"2e60d773-ddaa-48ec-b63d-69179db32795","Type":"ContainerStarted","Data":"647eac4a00d04e141f58bee48820f321ad2e6e553e38897cbc0ddc3dfdec0ca7"}
Apr 16 16:24:21.128061 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.128042 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" event={"ID":"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad","Type":"ContainerStarted","Data":"27880d3d8a6bcde19c77002d6c41041e3939f542b54bd1c12d08e26a49dcaa12"}
Apr 16 16:24:21.318121 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.318089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:21.318240 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.318221 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:24:21.318306 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.318241 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86cd4845fd-xltjh: secret "image-registry-tls" not found
Apr 16 16:24:21.318306 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.318292 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls podName:f15b3f4f-12e6-4033-8a06-87894739df95 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.318278214 +0000 UTC m=+37.107626317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls") pod "image-registry-86cd4845fd-xltjh" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95") : secret "image-registry-tls" not found
Apr 16 16:24:21.419239 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.419180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:21.419359 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.419306 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:24:21.419359 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.419340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:21.419479 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.419368 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls podName:4b370c20-7985-47c9-b2b1-e685ab180a6e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.419353247 +0000 UTC m=+37.208701350 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls") pod "cluster-samples-operator-667775844f-vhnrp" (UID: "4b370c20-7985-47c9-b2b1-e685ab180a6e") : secret "samples-operator-tls" not found
Apr 16 16:24:21.419479 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.419439 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:21.419593 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.419500 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls podName:476e7e41-cee5-4dbf-bd5f-7d3a9ce62024 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.41948147 +0000 UTC m=+37.208829581 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-2kdhn" (UID: "476e7e41-cee5-4dbf-bd5f-7d3a9ce62024") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:21.520405 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.520375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.520415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.520507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv"
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520532 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:21.520571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.520566261 +0000 UTC m=+37.309914373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : secret "router-metrics-certs-default" not found
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520653 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls podName:bfdb87ca-0166-4b3f-83a8-f352f00ae0c6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.520698417 +0000 UTC m=+37.310046530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls") pod "dns-default-5wlnl" (UID: "bfdb87ca-0166-4b3f-83a8-f352f00ae0c6") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520717 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520734 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.520724466 +0000 UTC m=+37.310072580 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : configmap references non-existent config key: service-ca.crt
Apr 16 16:24:21.521135 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:21.520751 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert podName:ddb0d5c7-7432-473d-a4e0-7822ea15651e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:23.520739701 +0000 UTC m=+37.310087809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert") pod "ingress-canary-nnptv" (UID: "ddb0d5c7-7432-473d-a4e0-7822ea15651e") : secret "canary-serving-cert" not found
Apr 16 16:24:22.137172 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:22.137056 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f23044-6510-4235-820f-fdca93d4bab6" containerID="9e4b77b3330ab4418e4228f28c17f9f41d9c09646f057c58cad1fab60a11df97" exitCode=0
Apr 16 16:24:22.137172 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:22.137131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerDied","Data":"9e4b77b3330ab4418e4228f28c17f9f41d9c09646f057c58cad1fab60a11df97"}
Apr 16 16:24:23.148730 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.148670 2576 generic.go:358] "Generic (PLEG): container finished" podID="51f23044-6510-4235-820f-fdca93d4bab6" containerID="89a2f9b61a2cd478f911556a36f60aa84d8feadd130d1b6d09cd61949704f54d" exitCode=0
Apr 16 16:24:23.149367 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.148765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerDied","Data":"89a2f9b61a2cd478f911556a36f60aa84d8feadd130d1b6d09cd61949704f54d"}
Apr 16 16:24:23.338816 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.338773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:23.338975 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.338928 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:24:23.338975 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.338946 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86cd4845fd-xltjh: secret "image-registry-tls" not found
Apr 16 16:24:23.339089 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.339011 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls podName:f15b3f4f-12e6-4033-8a06-87894739df95 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.338991597 +0000 UTC m=+41.128339714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls") pod "image-registry-86cd4845fd-xltjh" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95") : secret "image-registry-tls" not found
Apr 16 16:24:23.439698 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.439608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:23.439845 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.439715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:23.439845 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.439772 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:23.439845 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.439842 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls podName:476e7e41-cee5-4dbf-bd5f-7d3a9ce62024 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.439822703 +0000 UTC m=+41.229170820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-2kdhn" (UID: "476e7e41-cee5-4dbf-bd5f-7d3a9ce62024") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:23.440012 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.439912 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:24:23.440012 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.439961 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls podName:4b370c20-7985-47c9-b2b1-e685ab180a6e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.439945427 +0000 UTC m=+41.229293541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls") pod "cluster-samples-operator-667775844f-vhnrp" (UID: "4b370c20-7985-47c9-b2b1-e685ab180a6e") : secret "samples-operator-tls" not found
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.540848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv"
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.540927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.540976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:23.541002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541009 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541076 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert podName:ddb0d5c7-7432-473d-a4e0-7822ea15651e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.541056398 +0000 UTC m=+41.330404511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert") pod "ingress-canary-nnptv" (UID: "ddb0d5c7-7432-473d-a4e0-7822ea15651e") : secret "canary-serving-cert" not found
Apr 16 16:24:23.541139 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541120 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:24:23.541583 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls podName:bfdb87ca-0166-4b3f-83a8-f352f00ae0c6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.541155497 +0000 UTC m=+41.330503601 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls") pod "dns-default-5wlnl" (UID: "bfdb87ca-0166-4b3f-83a8-f352f00ae0c6") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:23.541583 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541120 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:24:23.541583 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541187 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.541179102 +0000 UTC m=+41.330527204 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : configmap references non-existent config key: service-ca.crt
Apr 16 16:24:23.541583 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:23.541233 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:27.541216641 +0000 UTC m=+41.330564758 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : secret "router-metrics-certs-default" not found
Apr 16 16:24:27.371232 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.371157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:27.371606 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.371320 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:24:27.371606 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.371341 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86cd4845fd-xltjh: secret "image-registry-tls" not found
Apr 16 16:24:27.371606 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.371405 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls podName:f15b3f4f-12e6-4033-8a06-87894739df95 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:35.371384515 +0000 UTC m=+49.160732620 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls") pod "image-registry-86cd4845fd-xltjh" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95") : secret "image-registry-tls" not found
Apr 16 16:24:27.471939 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.471905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:27.472116 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.471982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:27.472180 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.472124 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:24:27.472241 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.472198 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls podName:4b370c20-7985-47c9-b2b1-e685ab180a6e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:35.472177906 +0000 UTC m=+49.261526020 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls") pod "cluster-samples-operator-667775844f-vhnrp" (UID: "4b370c20-7985-47c9-b2b1-e685ab180a6e") : secret "samples-operator-tls" not found
Apr 16 16:24:27.472440 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.472408 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:27.472577 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.472491 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls podName:476e7e41-cee5-4dbf-bd5f-7d3a9ce62024 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:35.472471585 +0000 UTC m=+49.261819699 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-2kdhn" (UID: "476e7e41-cee5-4dbf-bd5f-7d3a9ce62024") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:27.572694 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.572650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:27.572828 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.572750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:27.572828 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.572779 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:24:27.572828 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.572799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:27.572976 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.572834 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert podName:ddb0d5c7-7432-473d-a4e0-7822ea15651e nodeName:}" 
failed. No retries permitted until 2026-04-16 16:24:35.57281438 +0000 UTC m=+49.362162485 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert") pod "ingress-canary-nnptv" (UID: "ddb0d5c7-7432-473d-a4e0-7822ea15651e") : secret "canary-serving-cert" not found Apr 16 16:24:27.572976 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:27.572862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:27.572976 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.572882 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:24:27.572976 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.572915 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:35.572903357 +0000 UTC m=+49.362251465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:27.572976 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.572933 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls podName:bfdb87ca-0166-4b3f-83a8-f352f00ae0c6 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:35.572925176 +0000 UTC m=+49.362273283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls") pod "dns-default-5wlnl" (UID: "bfdb87ca-0166-4b3f-83a8-f352f00ae0c6") : secret "dns-default-metrics-tls" not found Apr 16 16:24:27.573237 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.573013 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:24:27.573237 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:27.573079 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:35.573061028 +0000 UTC m=+49.362409146 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : secret "router-metrics-certs-default" not found Apr 16 16:24:30.167279 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.167246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" event={"ID":"51f23044-6510-4235-820f-fdca93d4bab6","Type":"ContainerStarted","Data":"d9b87cc437ecde220b41c8f7aa14c24f6e0e0d47fe5cbc880f0a79d9b0b0fc1b"} Apr 16 16:24:30.168927 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.168910 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/0.log" Apr 16 16:24:30.169042 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.168943 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="bfa69e98-1a05-4478-9f56-e8ca514789be" containerID="2741fe8e415cf1529b23d3ee80a0629388f910b1489d1b88f4add80a7f3db74d" exitCode=255 Apr 16 16:24:30.169042 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.168972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" event={"ID":"bfa69e98-1a05-4478-9f56-e8ca514789be","Type":"ContainerDied","Data":"2741fe8e415cf1529b23d3ee80a0629388f910b1489d1b88f4add80a7f3db74d"} Apr 16 16:24:30.169218 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.169199 2576 scope.go:117] "RemoveContainer" containerID="2741fe8e415cf1529b23d3ee80a0629388f910b1489d1b88f4add80a7f3db74d" Apr 16 16:24:30.170806 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.170431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t98gk" event={"ID":"883d7c41-dae5-4d36-b39e-13485dde73de","Type":"ContainerStarted","Data":"209378ba6f6854767bb18137e6f7b6d6e4eb27d658d0707d87d6b6d69dccc4ba"} Apr 16 16:24:30.172237 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.172215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57" event={"ID":"b4dd6fba-5a94-4e3e-92e2-74610c8a58bf","Type":"ContainerStarted","Data":"5b8c8782109d24f39893082100dabcbb560d3e7f25e11a3c722ee010c909c1f7"} Apr 16 16:24:30.173650 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.173630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" event={"ID":"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22","Type":"ContainerStarted","Data":"4c61951564f9e7d950b26ef16a06dc1a167dc6e34a2054cf4083336a3ed25482"} Apr 16 16:24:30.175094 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.175071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48" event={"ID":"ad1a58a4-dd90-4915-8c28-27977a5f2692","Type":"ContainerStarted","Data":"efc5bc3e3d5d087b3be8e21b0a189c97b7b37a980ca2bce81a5e7199c28dca4f"} Apr 16 16:24:30.176408 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.176378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" event={"ID":"2e60d773-ddaa-48ec-b63d-69179db32795","Type":"ContainerStarted","Data":"5db9221b16a7121eb76db4a2d1ebf46c803a02a05c8ed81412da83fb32791a3f"} Apr 16 16:24:30.178008 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.177986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" event={"ID":"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad","Type":"ContainerStarted","Data":"11121651f1522732196472e19256bde5eda5a6bdd51d9450a2548e82b1477600"} Apr 16 16:24:30.229575 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.229529 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" podStartSLOduration=20.730859315 podStartE2EDuration="29.22951313s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:20.952932389 +0000 UTC m=+34.742280501" lastFinishedPulling="2026-04-16 16:24:29.451586208 +0000 UTC m=+43.240934316" observedRunningTime="2026-04-16 16:24:30.228375852 +0000 UTC m=+44.017723976" watchObservedRunningTime="2026-04-16 16:24:30.22951313 +0000 UTC m=+44.018861259" Apr 16 16:24:30.230433 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.230392 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kpjzv" podStartSLOduration=10.357472268 podStartE2EDuration="43.230381318s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" 
firstStartedPulling="2026-04-16 16:23:48.137197523 +0000 UTC m=+1.926545630" lastFinishedPulling="2026-04-16 16:24:21.010106572 +0000 UTC m=+34.799454680" observedRunningTime="2026-04-16 16:24:30.208365478 +0000 UTC m=+43.997713603" watchObservedRunningTime="2026-04-16 16:24:30.230381318 +0000 UTC m=+44.019729444" Apr 16 16:24:30.246911 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.246867 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" podStartSLOduration=20.773529917 podStartE2EDuration="29.246852608s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:20.955309847 +0000 UTC m=+34.744657957" lastFinishedPulling="2026-04-16 16:24:29.428632544 +0000 UTC m=+43.217980648" observedRunningTime="2026-04-16 16:24:30.246149665 +0000 UTC m=+44.035497815" watchObservedRunningTime="2026-04-16 16:24:30.246852608 +0000 UTC m=+44.036200734" Apr 16 16:24:30.264107 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.264059 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7ql48" podStartSLOduration=21.51886197 podStartE2EDuration="29.264045383s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:20.951801521 +0000 UTC m=+34.741149624" lastFinishedPulling="2026-04-16 16:24:28.69698493 +0000 UTC m=+42.486333037" observedRunningTime="2026-04-16 16:24:30.263464025 +0000 UTC m=+44.052812150" watchObservedRunningTime="2026-04-16 16:24:30.264045383 +0000 UTC m=+44.053393512" Apr 16 16:24:30.289787 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.289738 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t98gk" podStartSLOduration=17.794291367 podStartE2EDuration="26.289722699s" podCreationTimestamp="2026-04-16 16:24:04 +0000 UTC" 
firstStartedPulling="2026-04-16 16:24:20.950625962 +0000 UTC m=+34.739974076" lastFinishedPulling="2026-04-16 16:24:29.446057304 +0000 UTC m=+43.235405408" observedRunningTime="2026-04-16 16:24:30.289452782 +0000 UTC m=+44.078800908" watchObservedRunningTime="2026-04-16 16:24:30.289722699 +0000 UTC m=+44.079070819" Apr 16 16:24:30.335897 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.335136 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" podStartSLOduration=20.895791732 podStartE2EDuration="29.335120079s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:20.989397781 +0000 UTC m=+34.778745899" lastFinishedPulling="2026-04-16 16:24:29.428726134 +0000 UTC m=+43.218074246" observedRunningTime="2026-04-16 16:24:30.334049075 +0000 UTC m=+44.123397200" watchObservedRunningTime="2026-04-16 16:24:30.335120079 +0000 UTC m=+44.124468207" Apr 16 16:24:30.354157 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.354115 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-mht57" podStartSLOduration=20.864375545 podStartE2EDuration="29.354100576s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:20.954451424 +0000 UTC m=+34.743799534" lastFinishedPulling="2026-04-16 16:24:29.444176456 +0000 UTC m=+43.233524565" observedRunningTime="2026-04-16 16:24:30.353798513 +0000 UTC m=+44.143146639" watchObservedRunningTime="2026-04-16 16:24:30.354100576 +0000 UTC m=+44.143448700" Apr 16 16:24:30.556783 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.556754 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"] Apr 16 16:24:30.577926 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.576973 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:30.580441 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.580217 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 16:24:30.580441 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.580271 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-x74q8\"" Apr 16 16:24:30.580441 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.580222 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 16:24:30.581880 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.581860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"] Apr 16 16:24:30.701610 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.701576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/daaa3a2c-4261-4fe7-8de0-322bed91df07-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:30.701610 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.701614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:30.802404 ip-10-0-141-93 kubenswrapper[2576]: 
I0416 16:24:30.802376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/daaa3a2c-4261-4fe7-8de0-322bed91df07-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:30.802404 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.802406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:30.802639 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:30.802619 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:30.802724 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:30.802712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert podName:daaa3a2c-4261-4fe7-8de0-322bed91df07 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:31.302671658 +0000 UTC m=+45.092019768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-lt76w" (UID: "daaa3a2c-4261-4fe7-8de0-322bed91df07") : secret "networking-console-plugin-cert" not found Apr 16 16:24:30.803146 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.803121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/daaa3a2c-4261-4fe7-8de0-322bed91df07-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:30.971726 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.971641 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx"] Apr 16 16:24:30.993781 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.993757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx"] Apr 16 16:24:30.993923 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.993893 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" Apr 16 16:24:30.997012 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.996988 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 16:24:30.997135 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.996988 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:30.997135 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:30.996988 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-r969k\"" Apr 16 16:24:31.105195 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.105156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4g6b\" (UniqueName: \"kubernetes.io/projected/60f9dc94-90f1-4b02-9ea3-7169126c399f-kube-api-access-v4g6b\") pod \"migrator-64d4d94569-scfkx\" (UID: \"60f9dc94-90f1-4b02-9ea3-7169126c399f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" Apr 16 16:24:31.182561 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.182536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:24:31.182911 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.182900 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/0.log" Apr 16 16:24:31.182958 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.182929 2576 generic.go:358] "Generic (PLEG): container finished" podID="bfa69e98-1a05-4478-9f56-e8ca514789be" 
containerID="d2fe604d341f0cf82c20f18d95406bb7b80530c5edc7168a202075a28afc3061" exitCode=255 Apr 16 16:24:31.183095 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.183067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" event={"ID":"bfa69e98-1a05-4478-9f56-e8ca514789be","Type":"ContainerDied","Data":"d2fe604d341f0cf82c20f18d95406bb7b80530c5edc7168a202075a28afc3061"} Apr 16 16:24:31.183181 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.183129 2576 scope.go:117] "RemoveContainer" containerID="2741fe8e415cf1529b23d3ee80a0629388f910b1489d1b88f4add80a7f3db74d" Apr 16 16:24:31.183292 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.183272 2576 scope.go:117] "RemoveContainer" containerID="d2fe604d341f0cf82c20f18d95406bb7b80530c5edc7168a202075a28afc3061" Apr 16 16:24:31.183524 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:31.183501 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6szqp_openshift-console-operator(bfa69e98-1a05-4478-9f56-e8ca514789be)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" podUID="bfa69e98-1a05-4478-9f56-e8ca514789be" Apr 16 16:24:31.206782 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.206763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4g6b\" (UniqueName: \"kubernetes.io/projected/60f9dc94-90f1-4b02-9ea3-7169126c399f-kube-api-access-v4g6b\") pod \"migrator-64d4d94569-scfkx\" (UID: \"60f9dc94-90f1-4b02-9ea3-7169126c399f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" Apr 16 16:24:31.219555 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.219531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4g6b\" (UniqueName: 
\"kubernetes.io/projected/60f9dc94-90f1-4b02-9ea3-7169126c399f-kube-api-access-v4g6b\") pod \"migrator-64d4d94569-scfkx\" (UID: \"60f9dc94-90f1-4b02-9ea3-7169126c399f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" Apr 16 16:24:31.304320 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.304294 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" Apr 16 16:24:31.308188 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.308156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:31.308320 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:31.308297 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:31.308424 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:31.308358 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert podName:daaa3a2c-4261-4fe7-8de0-322bed91df07 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:32.308340913 +0000 UTC m=+46.097689016 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-lt76w" (UID: "daaa3a2c-4261-4fe7-8de0-322bed91df07") : secret "networking-console-plugin-cert" not found Apr 16 16:24:31.418467 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:31.418442 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx"] Apr 16 16:24:31.421486 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:31.421462 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f9dc94_90f1_4b02_9ea3_7169126c399f.slice/crio-16a22290051c04ee5bcfe72719f41033644cfae1a7aba25c88caf029bef5280c WatchSource:0}: Error finding container 16a22290051c04ee5bcfe72719f41033644cfae1a7aba25c88caf029bef5280c: Status 404 returned error can't find the container with id 16a22290051c04ee5bcfe72719f41033644cfae1a7aba25c88caf029bef5280c Apr 16 16:24:32.188021 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:32.187990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:24:32.188442 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:32.188334 2576 scope.go:117] "RemoveContainer" containerID="d2fe604d341f0cf82c20f18d95406bb7b80530c5edc7168a202075a28afc3061" Apr 16 16:24:32.188562 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:32.188535 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6szqp_openshift-console-operator(bfa69e98-1a05-4478-9f56-e8ca514789be)\"" 
pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" podUID="bfa69e98-1a05-4478-9f56-e8ca514789be" Apr 16 16:24:32.189218 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:32.189194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" event={"ID":"60f9dc94-90f1-4b02-9ea3-7169126c399f","Type":"ContainerStarted","Data":"16a22290051c04ee5bcfe72719f41033644cfae1a7aba25c88caf029bef5280c"} Apr 16 16:24:32.317420 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:32.317392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" Apr 16 16:24:32.317845 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:32.317822 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:32.317950 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:32.317906 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert podName:daaa3a2c-4261-4fe7-8de0-322bed91df07 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:34.317877501 +0000 UTC m=+48.107225606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-lt76w" (UID: "daaa3a2c-4261-4fe7-8de0-322bed91df07") : secret "networking-console-plugin-cert" not found
Apr 16 16:24:33.105470 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.105447 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tqwdh_8593e95d-58e6-43c3-99b0-5582e1e25f39/dns-node-resolver/0.log"
Apr 16 16:24:33.149162 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.149142 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-q49b8"]
Apr 16 16:24:33.176125 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.176104 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-q49b8"]
Apr 16 16:24:33.176237 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.176229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.178847 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.178825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 16:24:33.180183 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.180163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hz7h8\""
Apr 16 16:24:33.180183 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.180173 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 16:24:33.180329 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.180163 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 16:24:33.180410 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.180394 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 16:24:33.192888 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.192868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" event={"ID":"60f9dc94-90f1-4b02-9ea3-7169126c399f","Type":"ContainerStarted","Data":"56ad8ad3e86fdde6e9d905eedf057f633c59cf6ebb16dd3c4b13889344116ff8"}
Apr 16 16:24:33.193133 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.192894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" event={"ID":"60f9dc94-90f1-4b02-9ea3-7169126c399f","Type":"ContainerStarted","Data":"58ce8176a114afd8633fb10d650b30ec4e2ad3284d54215a621572300e901604"}
Apr 16 16:24:33.224741 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.224689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/38938164-8ed1-4df2-ae86-f370306eff4e-signing-key\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.224817 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.224750 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/38938164-8ed1-4df2-ae86-f370306eff4e-signing-cabundle\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.224817 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.224802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqvsn\" (UniqueName: \"kubernetes.io/projected/38938164-8ed1-4df2-ae86-f370306eff4e-kube-api-access-wqvsn\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.325451 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.325420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/38938164-8ed1-4df2-ae86-f370306eff4e-signing-cabundle\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.325555 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.325461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqvsn\" (UniqueName: \"kubernetes.io/projected/38938164-8ed1-4df2-ae86-f370306eff4e-kube-api-access-wqvsn\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.325596 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.325553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/38938164-8ed1-4df2-ae86-f370306eff4e-signing-key\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.326060 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.326043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/38938164-8ed1-4df2-ae86-f370306eff4e-signing-cabundle\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.327854 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.327837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/38938164-8ed1-4df2-ae86-f370306eff4e-signing-key\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.333690 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.333656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqvsn\" (UniqueName: \"kubernetes.io/projected/38938164-8ed1-4df2-ae86-f370306eff4e-kube-api-access-wqvsn\") pod \"service-ca-bfc587fb7-q49b8\" (UID: \"38938164-8ed1-4df2-ae86-f370306eff4e\") " pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.484839 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.484793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-q49b8"
Apr 16 16:24:33.609180 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.609135 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-scfkx" podStartSLOduration=2.083257141 podStartE2EDuration="3.609120086s" podCreationTimestamp="2026-04-16 16:24:30 +0000 UTC" firstStartedPulling="2026-04-16 16:24:31.423901265 +0000 UTC m=+45.213249368" lastFinishedPulling="2026-04-16 16:24:32.949764201 +0000 UTC m=+46.739112313" observedRunningTime="2026-04-16 16:24:33.217209696 +0000 UTC m=+47.006557820" watchObservedRunningTime="2026-04-16 16:24:33.609120086 +0000 UTC m=+47.398468211"
Apr 16 16:24:33.609757 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:33.609735 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-q49b8"]
Apr 16 16:24:33.612443 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:33.612407 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38938164_8ed1_4df2_ae86_f370306eff4e.slice/crio-a05e38b2671550eaadbb469b223137574e137ebd4ed19a2be9baae230c9fc1c6 WatchSource:0}: Error finding container a05e38b2671550eaadbb469b223137574e137ebd4ed19a2be9baae230c9fc1c6: Status 404 returned error can't find the container with id a05e38b2671550eaadbb469b223137574e137ebd4ed19a2be9baae230c9fc1c6
Apr 16 16:24:34.196762 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:34.196700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-q49b8" event={"ID":"38938164-8ed1-4df2-ae86-f370306eff4e","Type":"ContainerStarted","Data":"322f9adbf14b3287c9f5f55c556813220e03c14ade80c3899e07f37a3e97ab1d"}
Apr 16 16:24:34.196762 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:34.196732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-q49b8" event={"ID":"38938164-8ed1-4df2-ae86-f370306eff4e","Type":"ContainerStarted","Data":"a05e38b2671550eaadbb469b223137574e137ebd4ed19a2be9baae230c9fc1c6"}
Apr 16 16:24:34.214192 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:34.214035 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-q49b8" podStartSLOduration=1.214019349 podStartE2EDuration="1.214019349s" podCreationTimestamp="2026-04-16 16:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:34.213784947 +0000 UTC m=+48.003133070" watchObservedRunningTime="2026-04-16 16:24:34.214019349 +0000 UTC m=+48.003367476"
Apr 16 16:24:34.306403 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:34.306381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zfglr_32a91c59-c74e-45df-ab79-de8449b1b1e3/node-ca/0.log"
Apr 16 16:24:34.332780 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:34.332755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"
Apr 16 16:24:34.332887 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:34.332852 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:24:34.332925 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:34.332902 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert podName:daaa3a2c-4261-4fe7-8de0-322bed91df07 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.3328876 +0000 UTC m=+52.122235703 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-lt76w" (UID: "daaa3a2c-4261-4fe7-8de0-322bed91df07") : secret "networking-console-plugin-cert" not found
Apr 16 16:24:34.910016 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:34.909987 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-scfkx_60f9dc94-90f1-4b02-9ea3-7169126c399f/migrator/0.log"
Apr 16 16:24:35.106522 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.106500 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-scfkx_60f9dc94-90f1-4b02-9ea3-7169126c399f/graceful-termination/0.log"
Apr 16 16:24:35.307847 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.307825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-6rs86_7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22/kube-storage-version-migrator-operator/0.log"
Apr 16 16:24:35.442272 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.442246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:24:35.446086 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.442728 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:24:35.446086 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.442751 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86cd4845fd-xltjh: secret "image-registry-tls" not found
Apr 16 16:24:35.446086 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.442936 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls podName:f15b3f4f-12e6-4033-8a06-87894739df95 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.442918154 +0000 UTC m=+65.232266274 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls") pod "image-registry-86cd4845fd-xltjh" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95") : secret "image-registry-tls" not found
Apr 16 16:24:35.543643 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.543607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"
Apr 16 16:24:35.543796 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.543733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"
Apr 16 16:24:35.543796 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.543757 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:35.543866 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.543829 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls podName:476e7e41-cee5-4dbf-bd5f-7d3a9ce62024 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.543813983 +0000 UTC m=+65.333162087 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-2kdhn" (UID: "476e7e41-cee5-4dbf-bd5f-7d3a9ce62024") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:24:35.543866 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.543854 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:24:35.543946 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.543911 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls podName:4b370c20-7985-47c9-b2b1-e685ab180a6e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.543895783 +0000 UTC m=+65.333243894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls") pod "cluster-samples-operator-667775844f-vhnrp" (UID: "4b370c20-7985-47c9-b2b1-e685ab180a6e") : secret "samples-operator-tls" not found
Apr 16 16:24:35.644915 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.644856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv"
Apr 16 16:24:35.645034 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.644917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:24:35.645034 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.644944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:35.645034 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:35.644960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7"
Apr 16 16:24:35.645034 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.644993 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:24:35.645234 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.645043 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert podName:ddb0d5c7-7432-473d-a4e0-7822ea15651e nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.645027773 +0000 UTC m=+65.434375878 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert") pod "ingress-canary-nnptv" (UID: "ddb0d5c7-7432-473d-a4e0-7822ea15651e") : secret "canary-serving-cert" not found
Apr 16 16:24:35.645234 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.645057 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:24:35.645234 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.645065 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:24:35.645234 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.645096 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.64508255 +0000 UTC m=+65.434430653 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : configmap references non-existent config key: service-ca.crt
Apr 16 16:24:35.645234 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.645122 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs podName:22155b83-35e6-40ea-8b7e-1fc752b875eb nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.645111243 +0000 UTC m=+65.434459353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs") pod "router-default-5dc6546dd6-bm7n7" (UID: "22155b83-35e6-40ea-8b7e-1fc752b875eb") : secret "router-metrics-certs-default" not found
Apr 16 16:24:35.645234 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:35.645133 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls podName:bfdb87ca-0166-4b3f-83a8-f352f00ae0c6 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:51.645127505 +0000 UTC m=+65.434475608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls") pod "dns-default-5wlnl" (UID: "bfdb87ca-0166-4b3f-83a8-f352f00ae0c6") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:38.366603 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:38.366562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"
Apr 16 16:24:38.367154 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:38.366775 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 16:24:38.367154 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:38.366859 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert podName:daaa3a2c-4261-4fe7-8de0-322bed91df07 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:46.366837502 +0000 UTC m=+60.156185609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-lt76w" (UID: "daaa3a2c-4261-4fe7-8de0-322bed91df07") : secret "networking-console-plugin-cert" not found
Apr 16 16:24:39.947909 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:39.947872 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:39.947909 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:39.947910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp"
Apr 16 16:24:39.948300 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:39.948217 2576 scope.go:117] "RemoveContainer" containerID="d2fe604d341f0cf82c20f18d95406bb7b80530c5edc7168a202075a28afc3061"
Apr 16 16:24:39.948373 ip-10-0-141-93 kubenswrapper[2576]: E0416 16:24:39.948356 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-6szqp_openshift-console-operator(bfa69e98-1a05-4478-9f56-e8ca514789be)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" podUID="bfa69e98-1a05-4478-9f56-e8ca514789be"
Apr 16 16:24:45.128284 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:45.128260 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9ctsd"
Apr 16 16:24:46.435085 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:46.435042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"
Apr 16 16:24:46.437589 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:46.437565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/daaa3a2c-4261-4fe7-8de0-322bed91df07-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-lt76w\" (UID: \"daaa3a2c-4261-4fe7-8de0-322bed91df07\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"
Apr 16 16:24:46.490801 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:46.490765 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"
Apr 16 16:24:46.607278 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:46.607243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w"]
Apr 16 16:24:46.611211 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:46.611189 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaaa3a2c_4261_4fe7_8de0_322bed91df07.slice/crio-623508f2e4e7e2b43d89719e8d1f6bc527f006f47138ab3c0eac2f241a5d44a3 WatchSource:0}: Error finding container 623508f2e4e7e2b43d89719e8d1f6bc527f006f47138ab3c0eac2f241a5d44a3: Status 404 returned error can't find the container with id 623508f2e4e7e2b43d89719e8d1f6bc527f006f47138ab3c0eac2f241a5d44a3
Apr 16 16:24:47.232021 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:47.231982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" event={"ID":"daaa3a2c-4261-4fe7-8de0-322bed91df07","Type":"ContainerStarted","Data":"623508f2e4e7e2b43d89719e8d1f6bc527f006f47138ab3c0eac2f241a5d44a3"}
Apr 16 16:24:48.236978 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:48.236045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" event={"ID":"daaa3a2c-4261-4fe7-8de0-322bed91df07","Type":"ContainerStarted","Data":"9c401fbcb648f29f42dea6ef3d571f18ef4e12486c9aa279a99437374ca452e9"}
Apr 16 16:24:48.256225 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:48.256171 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-lt76w" podStartSLOduration=16.970521897 podStartE2EDuration="18.25615831s" podCreationTimestamp="2026-04-16 16:24:30 +0000 UTC" firstStartedPulling="2026-04-16 16:24:46.613223941 +0000 UTC m=+60.402572045" lastFinishedPulling="2026-04-16 16:24:47.898860352 +0000 UTC m=+61.688208458" observedRunningTime="2026-04-16 16:24:48.255011765 +0000 UTC m=+62.044359890" watchObservedRunningTime="2026-04-16 16:24:48.25615831 +0000 UTC m=+62.045506435"
Apr 16 16:24:50.839399 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:50.839370 2576 scope.go:117] "RemoveContainer" containerID="d2fe604d341f0cf82c20f18d95406bb7b80530c5edc7168a202075a28afc3061"
Apr 16 16:24:51.011788 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.011760 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-sds72"]
Apr 16 16:24:51.015766 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.015749 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.022030 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.022006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dpxrk\""
Apr 16 16:24:51.022124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.022011 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 16:24:51.022124 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.022043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 16:24:51.040524 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.040502 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sds72"]
Apr 16 16:24:51.074526 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.074504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4aea228b-8508-454b-802b-1ec5cc8a6795-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.074616 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.074585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4aea228b-8508-454b-802b-1ec5cc8a6795-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.074663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.074634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4aea228b-8508-454b-802b-1ec5cc8a6795-crio-socket\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.074736 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.074695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfw5k\" (UniqueName: \"kubernetes.io/projected/4aea228b-8508-454b-802b-1ec5cc8a6795-kube-api-access-hfw5k\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.074774 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.074752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aea228b-8508-454b-802b-1ec5cc8a6795-data-volume\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.129464 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.129406 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz"]
Apr 16 16:24:51.132294 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.132278 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"]
Apr 16 16:24:51.132429 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.132415 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz"
Apr 16 16:24:51.135620 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135602 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 16:24:51.135745 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135639 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l"]
Apr 16 16:24:51.135745 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 16:24:51.135745 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135657 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 16:24:51.135904 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"
Apr 16 16:24:51.135976 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135961 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 16:24:51.135976 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.135969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-xshtk\""
Apr 16 16:24:51.138813 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.138664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 16:24:51.138889 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.138865 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 16:24:51.138960 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.138924 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 16:24:51.139151 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.139111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 16:24:51.139388 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.139369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l"
Apr 16 16:24:51.141754 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.141737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 16:24:51.161253 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.159312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz"]
Apr 16 16:24:51.161253 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.159930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"]
Apr 16 16:24:51.161813 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.161777 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l"]
Apr 16 16:24:51.175283 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4aea228b-8508-454b-802b-1ec5cc8a6795-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.175364 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-ca\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"
Apr 16 16:24:51.175364 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4aea228b-8508-454b-802b-1ec5cc8a6795-crio-socket\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.175471 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pnws\" (UniqueName: \"kubernetes.io/projected/0cc73e23-8268-4f8d-9ecb-386af6952eee-kube-api-access-7pnws\") pod \"managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz\" (UID: \"0cc73e23-8268-4f8d-9ecb-386af6952eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz"
Apr 16 16:24:51.175471 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9fe821b3-3e2c-409d-b216-37bd6ad414d5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"
Apr 16 16:24:51.175663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4aea228b-8508-454b-802b-1ec5cc8a6795-crio-socket\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.175663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfw5k\" (UniqueName: \"kubernetes.io/projected/4aea228b-8508-454b-802b-1ec5cc8a6795-kube-api-access-hfw5k\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.175663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqn52\" (UniqueName: \"kubernetes.io/projected/9fe821b3-3e2c-409d-b216-37bd6ad414d5-kube-api-access-rqn52\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"
Apr 16 16:24:51.175663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"
Apr 16 16:24:51.175663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aea228b-8508-454b-802b-1ec5cc8a6795-data-volume\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72"
Apr 16 16:24:51.175663 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID:
\"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.175989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-hub\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.175989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cc73e23-8268-4f8d-9ecb-386af6952eee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz\" (UID: \"0cc73e23-8268-4f8d-9ecb-386af6952eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" Apr 16 16:24:51.175989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4aea228b-8508-454b-802b-1ec5cc8a6795-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72" Apr 16 16:24:51.175989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.175822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4aea228b-8508-454b-802b-1ec5cc8a6795-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72" Apr 16 16:24:51.176194 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:24:51.176051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4aea228b-8508-454b-802b-1ec5cc8a6795-data-volume\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72" Apr 16 16:24:51.178447 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.178421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4aea228b-8508-454b-802b-1ec5cc8a6795-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72" Apr 16 16:24:51.200044 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.200019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfw5k\" (UniqueName: \"kubernetes.io/projected/4aea228b-8508-454b-802b-1ec5cc8a6795-kube-api-access-hfw5k\") pod \"insights-runtime-extractor-sds72\" (UID: \"4aea228b-8508-454b-802b-1ec5cc8a6795\") " pod="openshift-insights/insights-runtime-extractor-sds72" Apr 16 16:24:51.245889 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.245870 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:24:51.245964 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.245913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" event={"ID":"bfa69e98-1a05-4478-9f56-e8ca514789be","Type":"ContainerStarted","Data":"f2dcfe5b2da1dc5fb17cdb99dfa2bb76a6d032681264db6e40f9df6bbfb4689d"} Apr 16 16:24:51.246205 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.246188 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:51.276611 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrctq\" (UniqueName: \"kubernetes.io/projected/8a7d254b-1670-4607-a1b4-9b05fe979713-kube-api-access-xrctq\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.276719 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a7d254b-1670-4607-a1b4-9b05fe979713-tmp\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.276719 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8a7d254b-1670-4607-a1b4-9b05fe979713-klusterlet-config\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.276799 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-ca\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.276845 ip-10-0-141-93 
kubenswrapper[2576]: I0416 16:24:51.276803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pnws\" (UniqueName: \"kubernetes.io/projected/0cc73e23-8268-4f8d-9ecb-386af6952eee-kube-api-access-7pnws\") pod \"managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz\" (UID: \"0cc73e23-8268-4f8d-9ecb-386af6952eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" Apr 16 16:24:51.276845 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9fe821b3-3e2c-409d-b216-37bd6ad414d5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.276959 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqn52\" (UniqueName: \"kubernetes.io/projected/9fe821b3-3e2c-409d-b216-37bd6ad414d5-kube-api-access-rqn52\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.276959 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.277082 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.277082 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.276997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-hub\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.277082 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.277024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cc73e23-8268-4f8d-9ecb-386af6952eee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz\" (UID: \"0cc73e23-8268-4f8d-9ecb-386af6952eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" Apr 16 16:24:51.277648 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.277621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9fe821b3-3e2c-409d-b216-37bd6ad414d5-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.279627 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.279603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-ca\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: 
\"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.279843 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.279824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.279965 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.279944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0cc73e23-8268-4f8d-9ecb-386af6952eee-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz\" (UID: \"0cc73e23-8268-4f8d-9ecb-386af6952eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" Apr 16 16:24:51.280081 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.280060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.280144 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.280133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9fe821b3-3e2c-409d-b216-37bd6ad414d5-hub\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.304432 ip-10-0-141-93 kubenswrapper[2576]: 
I0416 16:24:51.304414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pnws\" (UniqueName: \"kubernetes.io/projected/0cc73e23-8268-4f8d-9ecb-386af6952eee-kube-api-access-7pnws\") pod \"managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz\" (UID: \"0cc73e23-8268-4f8d-9ecb-386af6952eee\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" Apr 16 16:24:51.305969 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.305951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqn52\" (UniqueName: \"kubernetes.io/projected/9fe821b3-3e2c-409d-b216-37bd6ad414d5-kube-api-access-rqn52\") pod \"cluster-proxy-proxy-agent-b676f6cfc-pf6lm\" (UID: \"9fe821b3-3e2c-409d-b216-37bd6ad414d5\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.323922 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.323901 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-sds72" Apr 16 16:24:51.350776 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.350740 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" podStartSLOduration=41.861236072 podStartE2EDuration="50.350727924s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:20.954607703 +0000 UTC m=+34.743955814" lastFinishedPulling="2026-04-16 16:24:29.444099553 +0000 UTC m=+43.233447666" observedRunningTime="2026-04-16 16:24:51.349850519 +0000 UTC m=+65.139198643" watchObservedRunningTime="2026-04-16 16:24:51.350727924 +0000 UTC m=+65.140076048" Apr 16 16:24:51.377813 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.377789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrctq\" (UniqueName: \"kubernetes.io/projected/8a7d254b-1670-4607-a1b4-9b05fe979713-kube-api-access-xrctq\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.377933 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.377830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a7d254b-1670-4607-a1b4-9b05fe979713-tmp\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.377933 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.377850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8a7d254b-1670-4607-a1b4-9b05fe979713-klusterlet-config\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: 
\"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.378483 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.378443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a7d254b-1670-4607-a1b4-9b05fe979713-tmp\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.381709 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.381614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8a7d254b-1670-4607-a1b4-9b05fe979713-klusterlet-config\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.392948 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.392928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrctq\" (UniqueName: \"kubernetes.io/projected/8a7d254b-1670-4607-a1b4-9b05fe979713-kube-api-access-xrctq\") pod \"klusterlet-addon-workmgr-779c9c494-xgf5l\" (UID: \"8a7d254b-1670-4607-a1b4-9b05fe979713\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.442699 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.442659 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-6szqp" Apr 16 16:24:51.448855 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.448833 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" Apr 16 16:24:51.451237 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.451206 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-sds72"] Apr 16 16:24:51.455501 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.455474 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aea228b_8508_454b_802b_1ec5cc8a6795.slice/crio-81c2d805149a5c7e7097804ab9d447142e8079888e2be0e0ed33b23cd9195537 WatchSource:0}: Error finding container 81c2d805149a5c7e7097804ab9d447142e8079888e2be0e0ed33b23cd9195537: Status 404 returned error can't find the container with id 81c2d805149a5c7e7097804ab9d447142e8079888e2be0e0ed33b23cd9195537 Apr 16 16:24:51.460705 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.460685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" Apr 16 16:24:51.478853 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.478830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:51.479024 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.479006 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:24:51.481823 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.481805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"image-registry-86cd4845fd-xltjh\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") " pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:51.580593 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.579357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" Apr 16 16:24:51.580593 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.579513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:51.580593 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.579547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:51.582664 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.582614 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b370c20-7985-47c9-b2b1-e685ab180a6e-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-vhnrp\" (UID: \"4b370c20-7985-47c9-b2b1-e685ab180a6e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" Apr 16 16:24:51.586777 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.585434 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:24:51.586777 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.585700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/476e7e41-cee5-4dbf-bd5f-7d3a9ce62024-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-2kdhn\" (UID: \"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:51.592964 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.592932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f605cef-2cd0-4480-b5a0-4bb58f196ac7-metrics-certs\") pod \"network-metrics-daemon-6dvl8\" (UID: \"6f605cef-2cd0-4480-b5a0-4bb58f196ac7\") " pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:51.666527 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.666458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz"] Apr 16 16:24:51.670016 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.669990 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc73e23_8268_4f8d_9ecb_386af6952eee.slice/crio-f8b2618b997037a9af4fd432269dd3199efa37746514a05a6c193385b403730c 
WatchSource:0}: Error finding container f8b2618b997037a9af4fd432269dd3199efa37746514a05a6c193385b403730c: Status 404 returned error can't find the container with id f8b2618b997037a9af4fd432269dd3199efa37746514a05a6c193385b403730c Apr 16 16:24:51.672562 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.672541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f6w5w\"" Apr 16 16:24:51.675653 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.675637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6dvl8" Apr 16 16:24:51.680453 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.680427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:51.680539 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.680471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:51.680539 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.680491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:51.680658 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.680551 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:51.681171 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.681152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22155b83-35e6-40ea-8b7e-1fc752b875eb-service-ca-bundle\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:51.683185 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.683167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ddb0d5c7-7432-473d-a4e0-7822ea15651e-cert\") pod \"ingress-canary-nnptv\" (UID: \"ddb0d5c7-7432-473d-a4e0-7822ea15651e\") " pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:51.683266 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.683193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/22155b83-35e6-40ea-8b7e-1fc752b875eb-metrics-certs\") pod \"router-default-5dc6546dd6-bm7n7\" (UID: \"22155b83-35e6-40ea-8b7e-1fc752b875eb\") " pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:51.683923 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.683902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bfdb87ca-0166-4b3f-83a8-f352f00ae0c6-metrics-tls\") pod \"dns-default-5wlnl\" (UID: \"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6\") " pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:51.687847 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.687819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm"] Apr 16 16:24:51.699437 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.699415 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l"] Apr 16 16:24:51.705131 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.705107 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fe821b3_3e2c_409d_b216_37bd6ad414d5.slice/crio-37a8feff93f349fcd9a7e05315ae5aba68915424a4e1ce174249ae6f20aa7ffc WatchSource:0}: Error finding container 37a8feff93f349fcd9a7e05315ae5aba68915424a4e1ce174249ae6f20aa7ffc: Status 404 returned error can't find the container with id 37a8feff93f349fcd9a7e05315ae5aba68915424a4e1ce174249ae6f20aa7ffc Apr 16 16:24:51.731327 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.731301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a7d254b_1670_4607_a1b4_9b05fe979713.slice/crio-43b243ce347fb7533eb1491c1c107dde8eb57a29814f3a6afb9dd92437ae44a0 WatchSource:0}: Error finding container 43b243ce347fb7533eb1491c1c107dde8eb57a29814f3a6afb9dd92437ae44a0: Status 404 returned error can't find the container with id 43b243ce347fb7533eb1491c1c107dde8eb57a29814f3a6afb9dd92437ae44a0 Apr 16 16:24:51.753374 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.750653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nvqpx\"" Apr 16 16:24:51.759273 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.758093 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:51.766388 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.761749 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b8zw7\"" Apr 16 16:24:51.768446 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.767323 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" Apr 16 16:24:51.781631 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.781604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:51.790989 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.790966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkt6\" (UniqueName: \"kubernetes.io/projected/903cab10-206d-4fef-bebf-bbf8db046d19-kube-api-access-cfkt6\") pod \"network-check-target-t7shk\" (UID: \"903cab10-206d-4fef-bebf-bbf8db046d19\") " pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:51.824706 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.824667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lkv92\"" Apr 16 16:24:51.832512 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.832085 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" Apr 16 16:24:51.841868 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.841843 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-z4m9w\"" Apr 16 16:24:51.848352 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.848312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6dvl8"] Apr 16 16:24:51.848936 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.848907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:51.851386 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.851361 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f605cef_2cd0_4480_b5a0_4bb58f196ac7.slice/crio-d3dca96df65e629d61f1ae554a0cbe6cf6d90566929f8da6449c32e68ace9847 WatchSource:0}: Error finding container d3dca96df65e629d61f1ae554a0cbe6cf6d90566929f8da6449c32e68ace9847: Status 404 returned error can't find the container with id d3dca96df65e629d61f1ae554a0cbe6cf6d90566929f8da6449c32e68ace9847 Apr 16 16:24:51.869009 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.868978 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-psw85\"" Apr 16 16:24:51.874199 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.874090 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5wlnl" Apr 16 16:24:51.889409 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.889178 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hp8vh\"" Apr 16 16:24:51.899197 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.899141 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nnptv" Apr 16 16:24:51.929004 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.928932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn"] Apr 16 16:24:51.934382 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.934324 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod476e7e41_cee5_4dbf_bd5f_7d3a9ce62024.slice/crio-952cb6d32fec2bc9b189a39b3ee959d9ec30247a8ab8badf57160974fb4c5b0a WatchSource:0}: Error finding container 952cb6d32fec2bc9b189a39b3ee959d9ec30247a8ab8badf57160974fb4c5b0a: Status 404 returned error can't find the container with id 952cb6d32fec2bc9b189a39b3ee959d9ec30247a8ab8badf57160974fb4c5b0a Apr 16 16:24:51.959551 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.959403 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w5tzs\"" Apr 16 16:24:51.962525 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.962309 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:51.974006 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:51.973938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86cd4845fd-xltjh"] Apr 16 16:24:51.986619 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:51.984521 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf15b3f4f_12e6_4033_8a06_87894739df95.slice/crio-896e0d7cd1086ddcc8f92f7cec480bcc742d5cdcbd8da33e9c0de16571a1af76 WatchSource:0}: Error finding container 896e0d7cd1086ddcc8f92f7cec480bcc742d5cdcbd8da33e9c0de16571a1af76: Status 404 returned error can't find the container with id 896e0d7cd1086ddcc8f92f7cec480bcc742d5cdcbd8da33e9c0de16571a1af76 Apr 16 16:24:52.011446 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.011372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp"] Apr 16 16:24:52.071647 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:52.071599 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22155b83_35e6_40ea_8b7e_1fc752b875eb.slice/crio-09bdbeb5bf5813bdeac010f16fa65d173c8aab6aacd1783e791b59277d541130 WatchSource:0}: Error finding container 09bdbeb5bf5813bdeac010f16fa65d173c8aab6aacd1783e791b59277d541130: Status 404 returned error can't find the container with id 09bdbeb5bf5813bdeac010f16fa65d173c8aab6aacd1783e791b59277d541130 Apr 16 16:24:52.072434 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.072409 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5dc6546dd6-bm7n7"] Apr 16 16:24:52.114457 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.114374 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5wlnl"] Apr 16 
16:24:52.123360 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.123338 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nnptv"] Apr 16 16:24:52.127485 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:52.127458 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfdb87ca_0166_4b3f_83a8_f352f00ae0c6.slice/crio-3fa82b6c94fc6e8f03815af0f4910e4b14648bb03be1aa7920b9c3d6fc8ce994 WatchSource:0}: Error finding container 3fa82b6c94fc6e8f03815af0f4910e4b14648bb03be1aa7920b9c3d6fc8ce994: Status 404 returned error can't find the container with id 3fa82b6c94fc6e8f03815af0f4910e4b14648bb03be1aa7920b9c3d6fc8ce994 Apr 16 16:24:52.128059 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:52.128040 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb0d5c7_7432_473d_a4e0_7822ea15651e.slice/crio-c7f0be0718554d76fa5c10e1fd9d1253a2094f1b7baa55823859008fd4509ac5 WatchSource:0}: Error finding container c7f0be0718554d76fa5c10e1fd9d1253a2094f1b7baa55823859008fd4509ac5: Status 404 returned error can't find the container with id c7f0be0718554d76fa5c10e1fd9d1253a2094f1b7baa55823859008fd4509ac5 Apr 16 16:24:52.155532 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.155505 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t7shk"] Apr 16 16:24:52.158634 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:24:52.158608 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod903cab10_206d_4fef_bebf_bbf8db046d19.slice/crio-a4067138c7ecc7720857172f422aa1f46c1c434bd5b89e1680b913a0be5a438d WatchSource:0}: Error finding container a4067138c7ecc7720857172f422aa1f46c1c434bd5b89e1680b913a0be5a438d: Status 404 returned error can't find the container with id 
a4067138c7ecc7720857172f422aa1f46c1c434bd5b89e1680b913a0be5a438d Apr 16 16:24:52.259115 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.259082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" event={"ID":"0cc73e23-8268-4f8d-9ecb-386af6952eee","Type":"ContainerStarted","Data":"f8b2618b997037a9af4fd432269dd3199efa37746514a05a6c193385b403730c"} Apr 16 16:24:52.262159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.262131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" event={"ID":"22155b83-35e6-40ea-8b7e-1fc752b875eb","Type":"ContainerStarted","Data":"d4de24fc93c421da7f81c23ee7416818b39a5073c87fe73c3f8f6c20932f9745"} Apr 16 16:24:52.262331 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.262312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" event={"ID":"22155b83-35e6-40ea-8b7e-1fc752b875eb","Type":"ContainerStarted","Data":"09bdbeb5bf5813bdeac010f16fa65d173c8aab6aacd1783e791b59277d541130"} Apr 16 16:24:52.264513 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.264492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" event={"ID":"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024","Type":"ContainerStarted","Data":"952cb6d32fec2bc9b189a39b3ee959d9ec30247a8ab8badf57160974fb4c5b0a"} Apr 16 16:24:52.266263 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.266238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" event={"ID":"f15b3f4f-12e6-4033-8a06-87894739df95","Type":"ContainerStarted","Data":"cbaefe1ee802ca94a92f630aadd54c0e2553d4933a12f9cae6b26211dbbf0262"} Apr 16 16:24:52.267052 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.267032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" event={"ID":"f15b3f4f-12e6-4033-8a06-87894739df95","Type":"ContainerStarted","Data":"896e0d7cd1086ddcc8f92f7cec480bcc742d5cdcbd8da33e9c0de16571a1af76"} Apr 16 16:24:52.267154 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.267060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" Apr 16 16:24:52.269179 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.269135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6dvl8" event={"ID":"6f605cef-2cd0-4480-b5a0-4bb58f196ac7","Type":"ContainerStarted","Data":"d3dca96df65e629d61f1ae554a0cbe6cf6d90566929f8da6449c32e68ace9847"} Apr 16 16:24:52.270319 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.270284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" event={"ID":"4b370c20-7985-47c9-b2b1-e685ab180a6e","Type":"ContainerStarted","Data":"16f72d94380b9975c2f1c7bb860db95ca73bc5feac18f003953a8279c8c56429"} Apr 16 16:24:52.271398 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.271372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" event={"ID":"8a7d254b-1670-4607-a1b4-9b05fe979713","Type":"ContainerStarted","Data":"43b243ce347fb7533eb1491c1c107dde8eb57a29814f3a6afb9dd92437ae44a0"} Apr 16 16:24:52.272916 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.272897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t7shk" event={"ID":"903cab10-206d-4fef-bebf-bbf8db046d19","Type":"ContainerStarted","Data":"a4067138c7ecc7720857172f422aa1f46c1c434bd5b89e1680b913a0be5a438d"} Apr 16 16:24:52.274374 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.274352 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-canary/ingress-canary-nnptv" event={"ID":"ddb0d5c7-7432-473d-a4e0-7822ea15651e","Type":"ContainerStarted","Data":"c7f0be0718554d76fa5c10e1fd9d1253a2094f1b7baa55823859008fd4509ac5"} Apr 16 16:24:52.276003 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.275980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sds72" event={"ID":"4aea228b-8508-454b-802b-1ec5cc8a6795","Type":"ContainerStarted","Data":"518f5e77c1f4048e47e83ff88b435dd73942b31dbf1b12cb9e542256e4c29582"} Apr 16 16:24:52.276112 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.276018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-sds72" event={"ID":"4aea228b-8508-454b-802b-1ec5cc8a6795","Type":"ContainerStarted","Data":"81c2d805149a5c7e7097804ab9d447142e8079888e2be0e0ed33b23cd9195537"} Apr 16 16:24:52.277345 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.277323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5wlnl" event={"ID":"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6","Type":"ContainerStarted","Data":"3fa82b6c94fc6e8f03815af0f4910e4b14648bb03be1aa7920b9c3d6fc8ce994"} Apr 16 16:24:52.278660 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.278634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" event={"ID":"9fe821b3-3e2c-409d-b216-37bd6ad414d5","Type":"ContainerStarted","Data":"37a8feff93f349fcd9a7e05315ae5aba68915424a4e1ce174249ae6f20aa7ffc"} Apr 16 16:24:52.312933 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.312887 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" podStartSLOduration=51.31287574 podStartE2EDuration="51.31287574s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:52.311571552 +0000 UTC m=+66.100919676" watchObservedRunningTime="2026-04-16 16:24:52.31287574 +0000 UTC m=+66.102223865" Apr 16 16:24:52.850031 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.849940 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:52.865126 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.864922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:52.901636 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:52.901586 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" podStartSLOduration=65.901567087 podStartE2EDuration="1m5.901567087s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:52.335765581 +0000 UTC m=+66.125113706" watchObservedRunningTime="2026-04-16 16:24:52.901567087 +0000 UTC m=+66.690915215" Apr 16 16:24:53.295821 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:53.295019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t7shk" event={"ID":"903cab10-206d-4fef-bebf-bbf8db046d19","Type":"ContainerStarted","Data":"2b91747c3d7ab02f5732f928acef61b9ec4e3b030cdfcf7c2a2f441eb8de562c"} Apr 16 16:24:53.295821 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:53.295783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t7shk" Apr 16 16:24:53.313041 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:53.311720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-sds72" event={"ID":"4aea228b-8508-454b-802b-1ec5cc8a6795","Type":"ContainerStarted","Data":"5fb626a4da3c609abbf8c7d2400c9c7004d347aaeff5a2a43f9badf42e800934"} Apr 16 16:24:53.313584 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:53.313545 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:53.318294 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:53.318268 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5dc6546dd6-bm7n7" Apr 16 16:24:53.397877 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:24:53.397501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t7shk" podStartSLOduration=66.397483472 podStartE2EDuration="1m6.397483472s" podCreationTimestamp="2026-04-16 16:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:53.361185055 +0000 UTC m=+67.150533193" watchObservedRunningTime="2026-04-16 16:24:53.397483472 +0000 UTC m=+67.186831598" Apr 16 16:25:03.357329 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.357293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5wlnl" event={"ID":"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6","Type":"ContainerStarted","Data":"065b9f86cc9b1d406b2f3dbc820dca97416fbe49705ec5358e942f714921f0f5"} Apr 16 16:25:03.357329 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.357332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5wlnl" event={"ID":"bfdb87ca-0166-4b3f-83a8-f352f00ae0c6","Type":"ContainerStarted","Data":"f033943760aaff5dc0b3b4af42611064c6f06cb896e4479191316df8a7d5cf7f"} Apr 16 16:25:03.357875 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.357542 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5wlnl" Apr 16 16:25:03.359372 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.359335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" event={"ID":"9fe821b3-3e2c-409d-b216-37bd6ad414d5","Type":"ContainerStarted","Data":"5889f084f2f7e6f1045c5d6bd94b39ba18060f8d9072de29c3fe0fc18e14428d"} Apr 16 16:25:03.360927 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.360902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" event={"ID":"0cc73e23-8268-4f8d-9ecb-386af6952eee","Type":"ContainerStarted","Data":"9ef158a2d3490e81fd141e214ed7fca005f358e09350f581cb21daaffecf547e"} Apr 16 16:25:03.362424 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.362396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" event={"ID":"476e7e41-cee5-4dbf-bd5f-7d3a9ce62024","Type":"ContainerStarted","Data":"785720b4891959f7068729c8b056205d9177f85ccaacabdddacca125cbf64888"} Apr 16 16:25:03.363888 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.363868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6dvl8" event={"ID":"6f605cef-2cd0-4480-b5a0-4bb58f196ac7","Type":"ContainerStarted","Data":"a97360edaf05d5d5cb28bc6438a48d6240a3f5d0e9607084b250d38fd3df3f36"} Apr 16 16:25:03.364003 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.363894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6dvl8" event={"ID":"6f605cef-2cd0-4480-b5a0-4bb58f196ac7","Type":"ContainerStarted","Data":"65ffad8092043ac7ad534e6a083994495fc8d970812856a0c086d2811ad55f3b"} Apr 16 16:25:03.365522 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.365501 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" event={"ID":"4b370c20-7985-47c9-b2b1-e685ab180a6e","Type":"ContainerStarted","Data":"c56e2f6b6c8a23fc7535aabde6afcafd572e97df939d0180df4ec595d8e6dbea"} Apr 16 16:25:03.365651 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.365529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" event={"ID":"4b370c20-7985-47c9-b2b1-e685ab180a6e","Type":"ContainerStarted","Data":"4c97b5af343abe2781ff85edfc5e9507d851dad542fb7ebf7e8cbd8cfc61168b"} Apr 16 16:25:03.367332 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.367294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" event={"ID":"8a7d254b-1670-4607-a1b4-9b05fe979713","Type":"ContainerStarted","Data":"4d1e021fd3ffc1cb143688e28e0916d37a7e71a344f7602fb0947cc89cb9737a"} Apr 16 16:25:03.367970 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.367941 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:25:03.369498 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.369466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nnptv" event={"ID":"ddb0d5c7-7432-473d-a4e0-7822ea15651e","Type":"ContainerStarted","Data":"ec1c19816b82bf0f39f59a75823d5a9617b9820076d94de4e225a5d89f57bd93"} Apr 16 16:25:03.370308 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.370292 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" Apr 16 16:25:03.371615 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.371597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-sds72" event={"ID":"4aea228b-8508-454b-802b-1ec5cc8a6795","Type":"ContainerStarted","Data":"a44c4018bafbd8965232d911091aa87512f58c46b58770c62a2e038452c2d15b"} Apr 16 16:25:03.391305 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.391258 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5wlnl" podStartSLOduration=33.717277103 podStartE2EDuration="44.391246922s" podCreationTimestamp="2026-04-16 16:24:19 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.129505137 +0000 UTC m=+65.918853239" lastFinishedPulling="2026-04-16 16:25:02.803474943 +0000 UTC m=+76.592823058" observedRunningTime="2026-04-16 16:25:03.390345767 +0000 UTC m=+77.179693893" watchObservedRunningTime="2026-04-16 16:25:03.391246922 +0000 UTC m=+77.180595053" Apr 16 16:25:03.481243 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.481147 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cb45bfb5c-bcglz" podStartSLOduration=1.3421024830000001 podStartE2EDuration="12.481128803s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:51.672173657 +0000 UTC m=+65.461521759" lastFinishedPulling="2026-04-16 16:25:02.811199971 +0000 UTC m=+76.600548079" observedRunningTime="2026-04-16 16:25:03.435761739 +0000 UTC m=+77.225109863" watchObservedRunningTime="2026-04-16 16:25:03.481128803 +0000 UTC m=+77.270476930" Apr 16 16:25:03.482126 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.482087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-2kdhn" podStartSLOduration=51.617719478 podStartE2EDuration="1m2.482077624s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:51.939150304 +0000 UTC m=+65.728498421" lastFinishedPulling="2026-04-16 
16:25:02.80350845 +0000 UTC m=+76.592856567" observedRunningTime="2026-04-16 16:25:03.475911289 +0000 UTC m=+77.265259415" watchObservedRunningTime="2026-04-16 16:25:03.482077624 +0000 UTC m=+77.271425797" Apr 16 16:25:03.548802 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.548755 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-vhnrp" podStartSLOduration=51.843609944 podStartE2EDuration="1m2.54874272s" podCreationTimestamp="2026-04-16 16:24:01 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.098951035 +0000 UTC m=+65.888299137" lastFinishedPulling="2026-04-16 16:25:02.804083805 +0000 UTC m=+76.593431913" observedRunningTime="2026-04-16 16:25:03.514311708 +0000 UTC m=+77.303659844" watchObservedRunningTime="2026-04-16 16:25:03.54874272 +0000 UTC m=+77.338090844" Apr 16 16:25:03.549037 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.549008 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6dvl8" podStartSLOduration=66.760919214 podStartE2EDuration="1m17.548999568s" podCreationTimestamp="2026-04-16 16:23:46 +0000 UTC" firstStartedPulling="2026-04-16 16:24:51.86175536 +0000 UTC m=+65.651103479" lastFinishedPulling="2026-04-16 16:25:02.649835727 +0000 UTC m=+76.439183833" observedRunningTime="2026-04-16 16:25:03.545495247 +0000 UTC m=+77.334843372" watchObservedRunningTime="2026-04-16 16:25:03.548999568 +0000 UTC m=+77.338347693" Apr 16 16:25:03.569304 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.569264 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nnptv" podStartSLOduration=33.906140164 podStartE2EDuration="44.56925195s" podCreationTimestamp="2026-04-16 16:24:19 +0000 UTC" firstStartedPulling="2026-04-16 16:24:52.139048887 +0000 UTC m=+65.928396990" lastFinishedPulling="2026-04-16 16:25:02.802160654 +0000 UTC 
m=+76.591508776" observedRunningTime="2026-04-16 16:25:03.567490292 +0000 UTC m=+77.356838416" watchObservedRunningTime="2026-04-16 16:25:03.56925195 +0000 UTC m=+77.358600075" Apr 16 16:25:03.604762 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.604705 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-sds72" podStartSLOduration=2.423446321 podStartE2EDuration="13.60468759s" podCreationTimestamp="2026-04-16 16:24:50 +0000 UTC" firstStartedPulling="2026-04-16 16:24:51.622223424 +0000 UTC m=+65.411571530" lastFinishedPulling="2026-04-16 16:25:02.803464685 +0000 UTC m=+76.592812799" observedRunningTime="2026-04-16 16:25:03.60303689 +0000 UTC m=+77.392385013" watchObservedRunningTime="2026-04-16 16:25:03.60468759 +0000 UTC m=+77.394035707" Apr 16 16:25:03.635520 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:03.635470 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-779c9c494-xgf5l" podStartSLOduration=1.564653495 podStartE2EDuration="12.635453971s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:51.733127056 +0000 UTC m=+65.522475159" lastFinishedPulling="2026-04-16 16:25:02.803927527 +0000 UTC m=+76.593275635" observedRunningTime="2026-04-16 16:25:03.63389099 +0000 UTC m=+77.423239115" watchObservedRunningTime="2026-04-16 16:25:03.635453971 +0000 UTC m=+77.424802098" Apr 16 16:25:06.384269 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:06.384224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" event={"ID":"9fe821b3-3e2c-409d-b216-37bd6ad414d5","Type":"ContainerStarted","Data":"19827e5bf9cff10fc72d48524948a14d208a2da8e528ec7b086c10ebeba1ce9f"} Apr 16 16:25:06.384743 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:06.384276 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" event={"ID":"9fe821b3-3e2c-409d-b216-37bd6ad414d5","Type":"ContainerStarted","Data":"1f29ae9a4e7da57a9c92ebccfc20ea2b165f348afa613dfc721afe221bdc0526"}
Apr 16 16:25:06.413559 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:06.413510 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-b676f6cfc-pf6lm" podStartSLOduration=1.283160818 podStartE2EDuration="15.413495758s" podCreationTimestamp="2026-04-16 16:24:51 +0000 UTC" firstStartedPulling="2026-04-16 16:24:51.707175061 +0000 UTC m=+65.496523181" lastFinishedPulling="2026-04-16 16:25:05.837510005 +0000 UTC m=+79.626858121" observedRunningTime="2026-04-16 16:25:06.411136354 +0000 UTC m=+80.200484479" watchObservedRunningTime="2026-04-16 16:25:06.413495758 +0000 UTC m=+80.202843883"
Apr 16 16:25:11.762774 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:11.762727 2576 patch_prober.go:28] interesting pod/image-registry-86cd4845fd-xltjh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 16:25:11.763159 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:11.762790 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" podUID="f15b3f4f-12e6-4033-8a06-87894739df95" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:25:13.317316 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:13.317289 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:25:13.377961 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:13.377937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5wlnl"
Apr 16 16:25:14.255895 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:14.255855 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86cd4845fd-xltjh"]
Apr 16 16:25:16.357995 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.357959 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pbj7p"]
Apr 16 16:25:16.361239 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.361223 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.365740 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.365715 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 16:25:16.367487 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.367465 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 16:25:16.368952 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.368925 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vdxn6\""
Apr 16 16:25:16.369054 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.368965 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 16:25:16.370527 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.370511 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 16:25:16.475070 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-sys\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475183 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-wtmp\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475183 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-root\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475183 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-textfile\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475339 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475339 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-tls\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475339 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-metrics-client-ca\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475486 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-accelerators-collector-config\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.475486 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.475431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx8q8\" (UniqueName: \"kubernetes.io/projected/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-kube-api-access-bx8q8\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.576818 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.576792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-accelerators-collector-config\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.576950 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.576843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx8q8\" (UniqueName: \"kubernetes.io/projected/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-kube-api-access-bx8q8\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.576950 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.576878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-sys\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.576950 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.576914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-wtmp\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.576950 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.576941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-root\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577141 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.576989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-textfile\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577141 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-root\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577141 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577141 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-sys\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577141 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-tls\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577141 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-wtmp\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577418 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-metrics-client-ca\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577418 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-textfile\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577758 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-accelerators-collector-config\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.577866 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.577802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-metrics-client-ca\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.579570 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.579552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-tls\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.579668 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.579647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.617580 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.617534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx8q8\" (UniqueName: \"kubernetes.io/projected/98e2c059-ba06-4cd2-9a3a-a1db3da6abe8-kube-api-access-bx8q8\") pod \"node-exporter-pbj7p\" (UID: \"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8\") " pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.671252 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:16.671236 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pbj7p"
Apr 16 16:25:16.683114 ip-10-0-141-93 kubenswrapper[2576]: W0416 16:25:16.683094 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98e2c059_ba06_4cd2_9a3a_a1db3da6abe8.slice/crio-6ef67cdc1f9ea67cff5c86d9709bc8f2bb064502aee0b4d3b663bb182baa214b WatchSource:0}: Error finding container 6ef67cdc1f9ea67cff5c86d9709bc8f2bb064502aee0b4d3b663bb182baa214b: Status 404 returned error can't find the container with id 6ef67cdc1f9ea67cff5c86d9709bc8f2bb064502aee0b4d3b663bb182baa214b
Apr 16 16:25:17.422822 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:17.422785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pbj7p" event={"ID":"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8","Type":"ContainerStarted","Data":"6ef67cdc1f9ea67cff5c86d9709bc8f2bb064502aee0b4d3b663bb182baa214b"}
Apr 16 16:25:18.426814 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:18.426778 2576 generic.go:358] "Generic (PLEG): container finished" podID="98e2c059-ba06-4cd2-9a3a-a1db3da6abe8" containerID="8d20ea8592e659c48ac37f6bd7026a54d34ca4424a4522941f163874f68ed8db" exitCode=0
Apr 16 16:25:18.427208 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:18.426867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pbj7p" event={"ID":"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8","Type":"ContainerDied","Data":"8d20ea8592e659c48ac37f6bd7026a54d34ca4424a4522941f163874f68ed8db"}
Apr 16 16:25:19.432831 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:19.432776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pbj7p" event={"ID":"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8","Type":"ContainerStarted","Data":"be7cbee50bce9f8eb198d2607c182c3b2d9deb132819a5b94d0941c7c8afbe2c"}
Apr 16 16:25:19.433223 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:19.432838 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pbj7p" event={"ID":"98e2c059-ba06-4cd2-9a3a-a1db3da6abe8","Type":"ContainerStarted","Data":"2c6201d7827850ac621bc5157d84a0074babc009790c4ad7edf55e160c5ee310"}
Apr 16 16:25:25.330170 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:25.330133 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t7shk"
Apr 16 16:25:25.350905 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:25.350859 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pbj7p" podStartSLOduration=8.614360398 podStartE2EDuration="9.350844084s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:16.685061431 +0000 UTC m=+90.474409538" lastFinishedPulling="2026-04-16 16:25:17.421545114 +0000 UTC m=+91.210893224" observedRunningTime="2026-04-16 16:25:19.492407276 +0000 UTC m=+93.281755400" watchObservedRunningTime="2026-04-16 16:25:25.350844084 +0000 UTC m=+99.140192209"
Apr 16 16:25:39.274461 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.274418 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" podUID="f15b3f4f-12e6-4033-8a06-87894739df95" containerName="registry" containerID="cri-o://cbaefe1ee802ca94a92f630aadd54c0e2553d4933a12f9cae6b26211dbbf0262" gracePeriod=30
Apr 16 16:25:39.492745 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.492716 2576 generic.go:358] "Generic (PLEG): container finished" podID="f15b3f4f-12e6-4033-8a06-87894739df95" containerID="cbaefe1ee802ca94a92f630aadd54c0e2553d4933a12f9cae6b26211dbbf0262" exitCode=0
Apr 16 16:25:39.492866 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.492773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" event={"ID":"f15b3f4f-12e6-4033-8a06-87894739df95","Type":"ContainerDied","Data":"cbaefe1ee802ca94a92f630aadd54c0e2553d4933a12f9cae6b26211dbbf0262"}
Apr 16 16:25:39.519205 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.519186 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:25:39.543256 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543168 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-image-registry-private-configuration\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543256 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543202 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z65k\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-kube-api-access-6z65k\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543256 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543229 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-registry-certificates\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543477 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543271 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-bound-sa-token\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543477 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543294 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-installation-pull-secrets\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543624 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543342 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543790 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543770 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f15b3f4f-12e6-4033-8a06-87894739df95-ca-trust-extracted\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.543901 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.543888 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-trusted-ca\") pod \"f15b3f4f-12e6-4033-8a06-87894739df95\" (UID: \"f15b3f4f-12e6-4033-8a06-87894739df95\") "
Apr 16 16:25:39.544553 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.544380 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:25:39.546844 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.546812 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:25:39.548949 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.547049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:25:39.548949 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.547393 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:25:39.548949 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.547759 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:25:39.548949 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.547946 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:25:39.552786 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.551645 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-kube-api-access-6z65k" (OuterVolumeSpecName: "kube-api-access-6z65k") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "kube-api-access-6z65k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:25:39.558607 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.558581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f15b3f4f-12e6-4033-8a06-87894739df95-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f15b3f4f-12e6-4033-8a06-87894739df95" (UID: "f15b3f4f-12e6-4033-8a06-87894739df95"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:25:39.645070 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645048 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-registry-tls\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645070 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645070 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f15b3f4f-12e6-4033-8a06-87894739df95-ca-trust-extracted\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645187 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645080 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-trusted-ca\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645187 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645090 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-image-registry-private-configuration\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645187 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645100 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6z65k\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-kube-api-access-6z65k\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645187 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645109 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f15b3f4f-12e6-4033-8a06-87894739df95-registry-certificates\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645187 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645118 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f15b3f4f-12e6-4033-8a06-87894739df95-bound-sa-token\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:39.645187 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:39.645126 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f15b3f4f-12e6-4033-8a06-87894739df95-installation-pull-secrets\") on node \"ip-10-0-141-93.ec2.internal\" DevicePath \"\""
Apr 16 16:25:40.497310 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:40.497273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh" event={"ID":"f15b3f4f-12e6-4033-8a06-87894739df95","Type":"ContainerDied","Data":"896e0d7cd1086ddcc8f92f7cec480bcc742d5cdcbd8da33e9c0de16571a1af76"}
Apr 16 16:25:40.497310 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:40.497302 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86cd4845fd-xltjh"
Apr 16 16:25:40.497811 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:40.497321 2576 scope.go:117] "RemoveContainer" containerID="cbaefe1ee802ca94a92f630aadd54c0e2553d4933a12f9cae6b26211dbbf0262"
Apr 16 16:25:40.521225 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:40.521203 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86cd4845fd-xltjh"]
Apr 16 16:25:40.531416 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:40.531392 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-86cd4845fd-xltjh"]
Apr 16 16:25:40.843705 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:40.843599 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15b3f4f-12e6-4033-8a06-87894739df95" path="/var/lib/kubelet/pods/f15b3f4f-12e6-4033-8a06-87894739df95/volumes"
Apr 16 16:25:55.544615 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:55.544583 2576 generic.go:358] "Generic (PLEG): container finished" podID="7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22" containerID="4c61951564f9e7d950b26ef16a06dc1a167dc6e34a2054cf4083336a3ed25482" exitCode=0
Apr 16 16:25:55.544955 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:55.544656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" event={"ID":"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22","Type":"ContainerDied","Data":"4c61951564f9e7d950b26ef16a06dc1a167dc6e34a2054cf4083336a3ed25482"}
Apr 16 16:25:55.544996 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:55.544970 2576 scope.go:117] "RemoveContainer" containerID="4c61951564f9e7d950b26ef16a06dc1a167dc6e34a2054cf4083336a3ed25482"
Apr 16 16:25:56.549395 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:25:56.549356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-6rs86" event={"ID":"7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22","Type":"ContainerStarted","Data":"196ee27d50bd40f46c404572ebde2f94c294e2834f96116104aa9f7a30513043"}
Apr 16 16:26:00.563329 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:00.563259 2576 generic.go:358] "Generic (PLEG): container finished" podID="f06b7a51-b7d8-47f8-ab24-e49d59b3cdad" containerID="11121651f1522732196472e19256bde5eda5a6bdd51d9450a2548e82b1477600" exitCode=0
Apr 16 16:26:00.563650 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:00.563335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" event={"ID":"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad","Type":"ContainerDied","Data":"11121651f1522732196472e19256bde5eda5a6bdd51d9450a2548e82b1477600"}
Apr 16 16:26:00.563724 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:00.563691 2576 scope.go:117] "RemoveContainer" containerID="11121651f1522732196472e19256bde5eda5a6bdd51d9450a2548e82b1477600"
Apr 16 16:26:01.567803 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:01.567769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kklp6" event={"ID":"f06b7a51-b7d8-47f8-ab24-e49d59b3cdad","Type":"ContainerStarted","Data":"e82b47f9accdc12c3a903ca67b79f1851707270aafc5e798383b962b15c0432f"}
Apr 16 16:26:05.581134 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:05.581109 2576 generic.go:358] "Generic (PLEG): container finished" podID="2e60d773-ddaa-48ec-b63d-69179db32795" containerID="5db9221b16a7121eb76db4a2d1ebf46c803a02a05c8ed81412da83fb32791a3f" exitCode=0
Apr 16 16:26:05.581418 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:05.581161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" event={"ID":"2e60d773-ddaa-48ec-b63d-69179db32795","Type":"ContainerDied","Data":"5db9221b16a7121eb76db4a2d1ebf46c803a02a05c8ed81412da83fb32791a3f"}
Apr 16 16:26:05.581457 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:05.581442 2576 scope.go:117] "RemoveContainer" containerID="5db9221b16a7121eb76db4a2d1ebf46c803a02a05c8ed81412da83fb32791a3f"
Apr 16 16:26:06.586334 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:26:06.586292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-pxgkx" event={"ID":"2e60d773-ddaa-48ec-b63d-69179db32795","Type":"ContainerStarted","Data":"0cca5fb1707a765479af6e0614160f1b1e696915a85660f6802d4edc690b8231"}
Apr 16 16:28:46.752320 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:28:46.752284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:28:46.753056 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:28:46.753032 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:28:46.769136 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:28:46.769109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:28:46.769931 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:28:46.769909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:28:46.773405 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:28:46.773383 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 16:33:46.784828 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:33:46.784788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:33:46.786488 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:33:46.786467 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:33:46.794436 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:33:46.794413 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:33:46.796444 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:33:46.796424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:38:46.808623 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:38:46.808537 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:38:46.810459 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:38:46.810435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:38:46.815497 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:38:46.815472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:38:46.817162 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:38:46.817143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:43:46.829165 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:43:46.829125 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:43:46.831483 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:43:46.831460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:43:46.836240 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:43:46.836215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:43:46.838948 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:43:46.838925 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:48:46.850944 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:48:46.850917 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:48:46.852985 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:48:46.852958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log"
Apr 16 16:48:46.857400 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:48:46.857379 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log"
Apr 16 16:48:46.859927 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:48:46.859904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr
16 16:53:46.871427 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:53:46.871396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:53:46.876354 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:53:46.876324 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:53:46.878798 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:53:46.878781 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:53:46.883510 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:53:46.883489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:58:46.891671 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:58:46.891640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:58:46.897709 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:58:46.897670 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 16:58:46.898834 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:58:46.898815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 16:58:46.904229 ip-10-0-141-93 kubenswrapper[2576]: I0416 16:58:46.904211 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:03:46.919301 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:03:46.919197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:03:46.923208 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:03:46.922779 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:03:46.926893 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:03:46.926872 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:03:46.929700 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:03:46.929665 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:08:46.942056 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:08:46.941947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:08:46.946082 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:08:46.944128 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:08:46.949179 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:08:46.949163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:08:46.950915 ip-10-0-141-93 kubenswrapper[2576]: I0416 
17:08:46.950897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:13:46.963043 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:13:46.962947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:13:46.966914 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:13:46.965212 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:13:46.969658 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:13:46.969636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:13:46.971627 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:13:46.971611 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:18:46.983971 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:18:46.983852 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:18:46.988005 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:18:46.986851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:18:46.997616 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:18:46.997592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 
16 17:18:47.000050 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:18:47.000031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:22:15.285656 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:15.285588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-t98gk_883d7c41-dae5-4d36-b39e-13485dde73de/global-pull-secret-syncer/0.log" Apr 16 17:22:15.339124 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:15.339098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2jhj4_063cc77b-5a11-4e5a-a733-15acf54a40e8/konnectivity-agent/0.log" Apr 16 17:22:15.488912 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:15.488892 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-93.ec2.internal_0b233db55a7ade7d393ce1a96715106e/haproxy/0.log" Apr 16 17:22:19.248488 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:19.248461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-2kdhn_476e7e41-cee5-4dbf-bd5f-7d3a9ce62024/cluster-monitoring-operator/0.log" Apr 16 17:22:19.472297 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:19.472267 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pbj7p_98e2c059-ba06-4cd2-9a3a-a1db3da6abe8/node-exporter/0.log" Apr 16 17:22:19.490397 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:19.490365 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pbj7p_98e2c059-ba06-4cd2-9a3a-a1db3da6abe8/kube-rbac-proxy/0.log" Apr 16 17:22:19.511311 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:19.511257 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-pbj7p_98e2c059-ba06-4cd2-9a3a-a1db3da6abe8/init-textfile/0.log" Apr 16 17:22:21.238361 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:21.238330 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-lt76w_daaa3a2c-4261-4fe7-8de0-322bed91df07/networking-console-plugin/0.log" Apr 16 17:22:21.661085 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:21.660988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/1.log" Apr 16 17:22:21.669910 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:21.669883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-6szqp_bfa69e98-1a05-4478-9f56-e8ca514789be/console-operator/2.log" Apr 16 17:22:22.453869 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.453839 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-7ql48_ad1a58a4-dd90-4915-8c28-27977a5f2692/volume-data-source-validator/0.log" Apr 16 17:22:22.660463 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.660434 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6"] Apr 16 17:22:22.660778 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.660763 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f15b3f4f-12e6-4033-8a06-87894739df95" containerName="registry" Apr 16 17:22:22.660829 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.660782 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15b3f4f-12e6-4033-8a06-87894739df95" containerName="registry" Apr 16 17:22:22.660867 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.660846 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="f15b3f4f-12e6-4033-8a06-87894739df95" containerName="registry" Apr 16 17:22:22.663852 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.663832 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.666363 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.666339 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-65l7w\"/\"kube-root-ca.crt\"" Apr 16 17:22:22.667370 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.667349 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-65l7w\"/\"openshift-service-ca.crt\"" Apr 16 17:22:22.667370 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.667364 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-65l7w\"/\"default-dockercfg-srjc9\"" Apr 16 17:22:22.671280 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.671240 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6"] Apr 16 17:22:22.811894 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.811873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfh45\" (UniqueName: \"kubernetes.io/projected/649329f9-17d2-4a10-aa5f-13d197f95ecc-kube-api-access-jfh45\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.812009 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.811911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-podres\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: 
\"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.812009 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.811932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-lib-modules\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.812085 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.812018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-sys\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.812085 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.812046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-proc\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912437 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfh45\" (UniqueName: \"kubernetes.io/projected/649329f9-17d2-4a10-aa5f-13d197f95ecc-kube-api-access-jfh45\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912576 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912460 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-podres\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912576 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912504 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-lib-modules\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912576 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-sys\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912764 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-proc\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912764 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-sys\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912764 ip-10-0-141-93 
kubenswrapper[2576]: I0416 17:22:22.912661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-proc\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912764 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-lib-modules\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.912764 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.912661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/649329f9-17d2-4a10-aa5f-13d197f95ecc-podres\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.920372 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.920353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfh45\" (UniqueName: \"kubernetes.io/projected/649329f9-17d2-4a10-aa5f-13d197f95ecc-kube-api-access-jfh45\") pod \"perf-node-gather-daemonset-5l8p6\" (UID: \"649329f9-17d2-4a10-aa5f-13d197f95ecc\") " pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:22.974417 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:22.974397 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:23.090457 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.090433 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6"] Apr 16 17:22:23.093047 ip-10-0-141-93 kubenswrapper[2576]: W0416 17:22:23.093020 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod649329f9_17d2_4a10_aa5f_13d197f95ecc.slice/crio-a26b31105d2125f266661061e2070f629e8cfdc75d49513222181f5511a89a3f WatchSource:0}: Error finding container a26b31105d2125f266661061e2070f629e8cfdc75d49513222181f5511a89a3f: Status 404 returned error can't find the container with id a26b31105d2125f266661061e2070f629e8cfdc75d49513222181f5511a89a3f Apr 16 17:22:23.094715 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.094693 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:22:23.100000 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.099954 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5wlnl_bfdb87ca-0166-4b3f-83a8-f352f00ae0c6/dns/0.log" Apr 16 17:22:23.120367 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.120351 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5wlnl_bfdb87ca-0166-4b3f-83a8-f352f00ae0c6/kube-rbac-proxy/0.log" Apr 16 17:22:23.251125 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.251102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tqwdh_8593e95d-58e6-43c3-99b0-5582e1e25f39/dns-node-resolver/0.log" Apr 16 17:22:23.503609 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.503581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" 
event={"ID":"649329f9-17d2-4a10-aa5f-13d197f95ecc","Type":"ContainerStarted","Data":"0506bd8b99012c92469a39c7f826fda3bc1ff4f6c22079ad619bededfbe31054"} Apr 16 17:22:23.504003 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.503616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" event={"ID":"649329f9-17d2-4a10-aa5f-13d197f95ecc","Type":"ContainerStarted","Data":"a26b31105d2125f266661061e2070f629e8cfdc75d49513222181f5511a89a3f"} Apr 16 17:22:23.504003 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.503794 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:23.520124 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.520086 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" podStartSLOduration=1.520072575 podStartE2EDuration="1.520072575s" podCreationTimestamp="2026-04-16 17:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:22:23.518378754 +0000 UTC m=+3517.307726914" watchObservedRunningTime="2026-04-16 17:22:23.520072575 +0000 UTC m=+3517.309420699" Apr 16 17:22:23.756776 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:23.756704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zfglr_32a91c59-c74e-45df-ab79-de8449b1b1e3/node-ca/0.log" Apr 16 17:22:24.429648 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:24.429616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5dc6546dd6-bm7n7_22155b83-35e6-40ea-8b7e-1fc752b875eb/router/0.log" Apr 16 17:22:24.793510 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:24.793479 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-nnptv_ddb0d5c7-7432-473d-a4e0-7822ea15651e/serve-healthcheck-canary/0.log" Apr 16 17:22:25.143124 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:25.143002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-pxgkx_2e60d773-ddaa-48ec-b63d-69179db32795/insights-operator/0.log" Apr 16 17:22:25.144270 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:25.144243 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-pxgkx_2e60d773-ddaa-48ec-b63d-69179db32795/insights-operator/1.log" Apr 16 17:22:25.227191 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:25.227155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sds72_4aea228b-8508-454b-802b-1ec5cc8a6795/kube-rbac-proxy/0.log" Apr 16 17:22:25.245717 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:25.245696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sds72_4aea228b-8508-454b-802b-1ec5cc8a6795/exporter/0.log" Apr 16 17:22:25.264950 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:25.264928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-sds72_4aea228b-8508-454b-802b-1ec5cc8a6795/extractor/0.log" Apr 16 17:22:29.517518 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:29.517494 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-65l7w/perf-node-gather-daemonset-5l8p6" Apr 16 17:22:31.174876 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:31.174850 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-scfkx_60f9dc94-90f1-4b02-9ea3-7169126c399f/migrator/0.log" Apr 16 17:22:31.192264 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:31.192244 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-scfkx_60f9dc94-90f1-4b02-9ea3-7169126c399f/graceful-termination/0.log" Apr 16 17:22:31.545791 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:31.545751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-6rs86_7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22/kube-storage-version-migrator-operator/1.log" Apr 16 17:22:31.547122 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:31.547102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-6rs86_7f8bcfec-4683-4c6e-9b31-ee63b9f0ff22/kube-storage-version-migrator-operator/0.log" Apr 16 17:22:32.749498 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.749474 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/kube-multus-additional-cni-plugins/0.log" Apr 16 17:22:32.771767 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.771742 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/egress-router-binary-copy/0.log" Apr 16 17:22:32.796071 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.796049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/cni-plugins/0.log" Apr 16 17:22:32.818514 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.818491 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/bond-cni-plugin/0.log" Apr 16 17:22:32.839011 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.838977 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/routeoverride-cni/0.log" Apr 16 17:22:32.863658 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.863637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/whereabouts-cni-bincopy/0.log" Apr 16 17:22:32.885916 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:32.885895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kpjzv_51f23044-6510-4235-820f-fdca93d4bab6/whereabouts-cni/0.log" Apr 16 17:22:33.263500 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:33.263475 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2nrk_6e7fea14-3672-4005-bbe8-e59d933d3173/kube-multus/0.log" Apr 16 17:22:33.380580 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:33.380557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6dvl8_6f605cef-2cd0-4480-b5a0-4bb58f196ac7/network-metrics-daemon/0.log" Apr 16 17:22:33.403390 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:33.403372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6dvl8_6f605cef-2cd0-4480-b5a0-4bb58f196ac7/kube-rbac-proxy/0.log" Apr 16 17:22:34.243832 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.243803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-controller/0.log" Apr 16 17:22:34.258454 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.258435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/0.log" Apr 16 17:22:34.289971 ip-10-0-141-93 kubenswrapper[2576]: I0416 
17:22:34.289949 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovn-acl-logging/1.log" Apr 16 17:22:34.307718 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.307692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/kube-rbac-proxy-node/0.log" Apr 16 17:22:34.325958 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.325938 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 17:22:34.345964 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.345947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/northd/0.log" Apr 16 17:22:34.362831 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.362803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/nbdb/0.log" Apr 16 17:22:34.381629 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.381603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/sbdb/0.log" Apr 16 17:22:34.545981 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:34.545931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9ctsd_c2efb54c-bc99-4126-ac22-6f7a17d6cd42/ovnkube-controller/0.log" Apr 16 17:22:36.144324 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:36.144297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-mht57_b4dd6fba-5a94-4e3e-92e2-74610c8a58bf/check-endpoints/0.log" Apr 16 17:22:36.209917 ip-10-0-141-93 kubenswrapper[2576]: 
I0416 17:22:36.209893 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-t7shk_903cab10-206d-4fef-bebf-bbf8db046d19/network-check-target-container/0.log" Apr 16 17:22:37.079572 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:37.079547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-r5v2l_ddc36c0b-3388-4a0a-a038-6e0a618d18c2/iptables-alerter/0.log" Apr 16 17:22:37.699313 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:37.699289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-drsbg_ee219be2-6e0e-45ac-874e-43970e574181/tuned/0.log" Apr 16 17:22:39.397669 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:39.397637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-vhnrp_4b370c20-7985-47c9-b2b1-e685ab180a6e/cluster-samples-operator/0.log" Apr 16 17:22:39.413781 ip-10-0-141-93 kubenswrapper[2576]: I0416 17:22:39.413734 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-vhnrp_4b370c20-7985-47c9-b2b1-e685ab180a6e/cluster-samples-operator-watch/0.log"