Apr 20 12:11:44.068478 ip-10-0-131-55 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 12:11:44.068490 ip-10-0-131-55 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 12:11:44.068497 ip-10-0-131-55 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 12:11:44.068727 ip-10-0-131-55 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 12:11:54.317936 ip-10-0-131-55 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 12:11:54.317960 ip-10-0-131-55 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 16d63904ef2145f49435def716085c95 --
Apr 20 12:13:54.314904 ip-10-0-131-55 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 12:13:54.817746 ip-10-0-131-55 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:13:54.817746 ip-10-0-131-55 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 12:13:54.817746 ip-10-0-131-55 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:13:54.817746 ip-10-0-131-55 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 12:13:54.817746 ip-10-0-131-55 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 12:13:54.821542 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.821450 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 12:13:54.827537 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827521 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:13:54.827537 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827537 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827540 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827544 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827547 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827550 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827554 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827557 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827560 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827563 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827565 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827568 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827570 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827573 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827581 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827584 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827587 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827592 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827595 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:13:54.827606 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827597 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827600 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827602 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827605 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827607 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827610 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827613 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827616 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827618 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827621 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827624 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827627 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827631 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827649 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827653 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827656 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827658 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827662 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827665 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:13:54.828063 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827668 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827670 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827672 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827675 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827677 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827680 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827682 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827685 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827687 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827690 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827695 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827698 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827700 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827703 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827705 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827709 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827711 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827714 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827716 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827719 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:13:54.828510 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827722 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827724 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827727 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827729 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827732 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827734 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827737 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827739 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827742 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827744 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827747 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827750 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827753 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827755 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827758 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827761 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827763 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827766 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827768 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827770 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:13:54.829005 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827773 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827775 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827777 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827780 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827782 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827786 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.827789 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828198 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828202 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828205 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828208 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828210 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828213 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828216 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828218 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828221 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828223 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828226 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828228 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828231 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:13:54.829477 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828233 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828236 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828238 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828241 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828244 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828246 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828249 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828251 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828254 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828256 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828258 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828261 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828263 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828266 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828268 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828270 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828273 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828275 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828278 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828281 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:13:54.830003 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828284 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828286 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828289 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828292 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828294 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828296 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828299 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828301 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828304 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828307 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828311 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828314 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828316 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828319 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828321 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828324 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828327 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828329 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:13:54.830591 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828333 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828336 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828338 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828341 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828343 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828346 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828348 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828350 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828353 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828355 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828358 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828360 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828364 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828368 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828370 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828373 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828375 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828378 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828380 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828383 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:13:54.831040 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828385 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828388 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828390 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828392 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828395 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828397 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828400 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828402 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828405 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828407 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828410 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828414 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828416 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828419 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.828421 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828499 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828506 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828513 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828518 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828523 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828526 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 12:13:54.831529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828530 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828535 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828539 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828542 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828547 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828550 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828553 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828556 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828559 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828562 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828565 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828568 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828571 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828575 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828578 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828581 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828584 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828587 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828591 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828594 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828597 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828601 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828604 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828607 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 12:13:54.832049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828610 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828614 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828616 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420
12:13:54.828621 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828624 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828627 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828630 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828633 2577 flags.go:64] FLAG: --enable-server="true" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828650 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828655 2577 flags.go:64] FLAG: --event-burst="100" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828658 2577 flags.go:64] FLAG: --event-qps="50" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828661 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828664 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828667 2577 flags.go:64] FLAG: --eviction-hard="" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828671 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828674 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828677 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828680 2577 flags.go:64] FLAG: --eviction-soft="" Apr 20 12:13:54.832621 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828683 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828686 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828688 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828691 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828694 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828697 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828700 2577 flags.go:64] FLAG: --feature-gates="" Apr 20 12:13:54.832621 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828703 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828706 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828709 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828712 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828715 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828718 2577 flags.go:64] FLAG: --help="false" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828721 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-131-55.ec2.internal" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828725 2577 flags.go:64] FLAG: 
--housekeeping-interval="10s" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828728 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828731 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828734 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828738 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828741 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828743 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828746 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828749 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828752 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828755 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828758 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828761 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828763 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: 
I0420 12:13:54.828766 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828769 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828772 2577 flags.go:64] FLAG: --lock-file="" Apr 20 12:13:54.833223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828774 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828777 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828780 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828786 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828788 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828791 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828794 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828797 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828800 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828803 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828806 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828810 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 12:13:54.833823 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828813 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828817 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828820 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828822 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828825 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828829 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828833 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828836 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828839 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828846 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828849 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828852 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 12:13:54.833823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828855 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828858 2577 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828864 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828867 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828870 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828873 2577 flags.go:64] FLAG: --port="10250" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828876 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828879 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-069b983b3ed5b3ea8" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828882 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828885 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828887 2577 flags.go:64] FLAG: --register-node="true" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828890 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828893 2577 flags.go:64] FLAG: --register-with-taints="" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828896 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828899 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828902 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 12:13:54.834405 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828904 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828908 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828911 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828914 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828917 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828920 2577 flags.go:64] FLAG: --runonce="false" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828922 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828925 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828928 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 12:13:54.834405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828932 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828939 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828942 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828945 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828948 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828951 2577 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828954 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828957 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828959 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828962 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828965 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828968 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828974 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828977 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828980 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828983 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828986 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828989 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828992 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.828995 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 12:13:54.835053 ip-10-0-131-55 
kubenswrapper[2577]: I0420 12:13:54.828997 2577 flags.go:64] FLAG: --v="2" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.829002 2577 flags.go:64] FLAG: --version="false" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.829006 2577 flags.go:64] FLAG: --vmodule="" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.829010 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.829014 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 12:13:54.835053 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829109 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829113 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829116 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829118 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829121 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829124 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829126 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829129 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829134 2577 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDC Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829137 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829140 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829142 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829145 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829147 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829150 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829152 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829155 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829157 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829160 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829162 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 12:13:54.835716 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829165 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829167 2577 feature_gate.go:328] 
unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829170 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829172 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829175 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829177 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829180 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829183 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829185 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829187 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829190 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829193 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829195 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829198 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829200 2577 
feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829202 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829205 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829207 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829210 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829212 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 12:13:54.836378 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829216 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829220 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829222 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829225 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829227 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829230 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829232 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 12:13:54.837233 ip-10-0-131-55 
kubenswrapper[2577]: W0420 12:13:54.829236 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829240 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829242 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829245 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829248 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829251 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829253 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829256 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829258 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829260 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829263 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829265 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 12:13:54.837233 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829268 2577 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829270 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829273 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829276 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829278 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829280 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829283 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829285 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829288 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829291 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829293 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829296 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829298 2577 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829303 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829305 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829308 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829311 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829313 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829315 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:13:54.838046 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829318 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829320 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829324 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829327 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829330 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829333 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829335 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.829338 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.830195 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.838427 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.838450 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838601 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838611 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838619 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838626 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:13:54.838844 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838632 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838656 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838662 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838667 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838672 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838677 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838687 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838693 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838697 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838701 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838706 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838710 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838715 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838719 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838723 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838728 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838732 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838737 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838748 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838752 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:13:54.839312 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838757 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838761 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838765 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838770 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838775 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838779 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838784 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838791 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838796 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838801 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838805 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838815 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838819 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838823 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838827 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838832 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838838 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838845 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838850 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:13:54.839831 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838854 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838859 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838864 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838869 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838879 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838884 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838889 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838894 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838900 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838966 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838972 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.838997 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839001 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839004 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839012 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839132 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839142 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839146 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839149 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839152 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:13:54.840273 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839156 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839159 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839162 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839165 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839169 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839172 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839174 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839178 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839181 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839184 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839186 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839189 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839192 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839194 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839198 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839200 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839203 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839205 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839208 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839211 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:13:54.840787 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839213 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839216 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839219 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.839225 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839335 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839341 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839344 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839348 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839350 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839354 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839356 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839359 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839361 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839364 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839367 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 12:13:54.841262 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839370 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839372 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839375 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839377 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839380 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839382 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839385 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839387 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839390 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839392 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839395 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839397 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839400 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839412 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839415 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839418 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839420 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839423 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839427 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839429 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 12:13:54.841632 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839432 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839436 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839440 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839443 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839447 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839451 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839455 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839458 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839460 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839463 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839466 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839469 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839472 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839474 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839477 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839480 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839482 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839485 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839488 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 12:13:54.842130 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839490 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839493 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839495 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839497 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839500 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839502 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839505 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839507 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839509 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839512 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839514 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839517 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839519 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839522 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839524 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839527 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839530 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839532 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839535 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839538 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 12:13:54.842597 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839540 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839543 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839545 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839548 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839550 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839553 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839555 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839558 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839560 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839563 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839565 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839567 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839570 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839572 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839575 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.839577 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 12:13:54.843091 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.839582 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 12:13:54.843492 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.840492 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 12:13:54.845730 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.845716 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 12:13:54.847872 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.847861 2577 server.go:1019] "Starting client certificate rotation"
Apr 20 12:13:54.847981 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.847965 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 12:13:54.848013 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.848001 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 12:13:54.876310 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.876291 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 12:13:54.878835 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.878808 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 12:13:54.897819 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.897805 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 20 12:13:54.903793 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.903779 2577 log.go:25] "Validated CRI v1 image API"
Apr 20 12:13:54.905151 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.905134 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 12:13:54.909707 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.909688 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 12:13:54.911810 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.911792 2577 fs.go:135] Filesystem UUIDs: map[45aa7f6e-e079-4cd3-a113-003d0745705f:/dev/nvme0n1p4 59234494-9c5c-49c4-bcd4-efb34534267e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 20 12:13:54.911854 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.911811 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 12:13:54.918434 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.918320 2577 manager.go:217] Machine: {Timestamp:2026-04-20 12:13:54.915793962 +0000 UTC m=+0.468018141 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3110614 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26fe260601cc00e8f11c1cb3e8ec5f SystemUUID:ec26fe26-0601-cc00-e8f1-1c1cb3e8ec5f BootID:16d63904-ef21-45f4-9435-def716085c95 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f9:f4:18:fd:d1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f9:f4:18:fd:d1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:dc:41:f3:dc:e2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 12:13:54.918434 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.918424 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 12:13:54.918566 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.918525 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 12:13:54.919672 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.919651 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 12:13:54.919809 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.919674 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManage
rPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 12:13:54.919856 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.919818 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 12:13:54.919856 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.919826 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 12:13:54.919856 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.919839 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 12:13:54.919856 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.919854 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 12:13:54.921497 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.921487 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 12:13:54.921800 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.921790 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 12:13:54.924446 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.924437 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 20 12:13:54.924506 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.924451 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 12:13:54.924506 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.924464 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 12:13:54.924506 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.924473 2577 kubelet.go:397] "Adding apiserver pod source" Apr 20 12:13:54.924506 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.924483 2577 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 12:13:54.925913 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.925898 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 12:13:54.925913 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.925916 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 12:13:54.929990 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.929969 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 12:13:54.931388 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.931375 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 12:13:54.933379 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933365 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 12:13:54.933379 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933382 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933389 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933394 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933400 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933405 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933410 2577 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/iscsi" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933416 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933424 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933432 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933441 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 12:13:54.933472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.933449 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 12:13:54.934989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.934977 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 12:13:54.935028 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.934991 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 12:13:54.937862 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.937843 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-55.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 12:13:54.938049 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.938020 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 12:13:54.938138 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.938067 2577 reflector.go:200] "Failed to 
watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 12:13:54.938972 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.938959 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 12:13:54.939023 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.938994 2577 server.go:1295] "Started kubelet" Apr 20 12:13:54.939134 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.939075 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 12:13:54.939187 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.939140 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 12:13:54.939220 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.939206 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 12:13:54.939885 ip-10-0-131-55 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 12:13:54.941885 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.941865 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 12:13:54.943084 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.943067 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 12:13:54.945491 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.944396 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-55.ec2.internal.18a80fa07b822879 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-55.ec2.internal,UID:ip-10-0-131-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-55.ec2.internal,},FirstTimestamp:2026-04-20 12:13:54.938972281 +0000 UTC m=+0.491196460,LastTimestamp:2026-04-20 12:13:54.938972281 +0000 UTC m=+0.491196460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-55.ec2.internal,}" Apr 20 12:13:54.948442 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.948424 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 12:13:54.950818 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.950799 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 12:13:54.951400 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.951381 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 12:13:54.952587 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.952555 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8nvw" Apr 20 12:13:54.953296 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953265 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 12:13:54.953296 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953298 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 12:13:54.953453 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.953350 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 20 12:13:54.953507 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953463 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 20 12:13:54.953507 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953473 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 20 12:13:54.953751 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953732 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 12:13:54.953865 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953849 2577 factory.go:55] Registering systemd factory Apr 20 12:13:54.953932 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.953917 2577 factory.go:223] Registration of the systemd container factory successfully Apr 20 12:13:54.954353 ip-10-0-131-55 kubenswrapper[2577]: I0420 
12:13:54.954334 2577 factory.go:153] Registering CRI-O factory Apr 20 12:13:54.954353 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.954352 2577 factory.go:223] Registration of the crio container factory successfully Apr 20 12:13:54.954451 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.954403 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 12:13:54.954486 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.954455 2577 factory.go:103] Registering Raw factory Apr 20 12:13:54.954486 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.954471 2577 manager.go:1196] Started watching for new ooms in manager Apr 20 12:13:54.954846 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.954836 2577 manager.go:319] Starting recovery of all containers Apr 20 12:13:54.955051 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.955031 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 12:13:54.955776 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.955699 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 12:13:54.962860 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:54.962669 2577 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.min": read 
/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.min: no such device Apr 20 12:13:54.963969 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.963949 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8nvw" Apr 20 12:13:54.966767 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.966751 2577 manager.go:324] Recovery completed Apr 20 12:13:54.970820 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.970808 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:13:54.973349 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.973336 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:13:54.973409 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.973364 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:13:54.973409 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.973375 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:13:54.973875 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.973860 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 12:13:54.973875 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.973873 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 12:13:54.973951 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.973887 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 12:13:54.975377 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:54.975295 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-131-55.ec2.internal.18a80fa07d8eb8e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-55.ec2.internal,UID:ip-10-0-131-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-55.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-55.ec2.internal,},FirstTimestamp:2026-04-20 12:13:54.973350121 +0000 UTC m=+0.525574300,LastTimestamp:2026-04-20 12:13:54.973350121 +0000 UTC m=+0.525574300,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-55.ec2.internal,}" Apr 20 12:13:54.977686 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.977674 2577 policy_none.go:49] "None policy: Start" Apr 20 12:13:54.977747 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.977690 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 12:13:54.977747 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:54.977699 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 20 12:13:55.017202 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017187 2577 manager.go:341] "Starting Device Plugin manager" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.017232 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017246 2577 server.go:85] "Starting device plugin registration server" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017449 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017462 2577 container_log_manager.go:189] "Initializing container log rotate 
workers" workers=1 monitorPeriod="10s" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017603 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017692 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.017700 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.018295 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 12:13:55.039126 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.018340 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-55.ec2.internal\" not found" Apr 20 12:13:55.064423 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.064395 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 12:13:55.065728 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.065706 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 12:13:55.065797 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.065747 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 12:13:55.065797 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.065768 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 12:13:55.065797 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.065776 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 12:13:55.065903 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.065812 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 12:13:55.067864 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.067819 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:13:55.117772 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.117753 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:13:55.118754 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.118738 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:13:55.118830 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.118770 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:13:55.118830 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.118780 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:13:55.118830 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.118804 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.127415 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.127401 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.127465 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.127421 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-55.ec2.internal\": node \"ip-10-0-131-55.ec2.internal\" not found" Apr 20 12:13:55.162158 
ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.162133 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 20 12:13:55.166448 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.166425 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"] Apr 20 12:13:55.166527 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.166500 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:13:55.167258 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.167243 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:13:55.167333 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.167274 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:13:55.167333 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.167289 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:13:55.169674 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.169659 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:13:55.169774 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.169760 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.169824 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.169786 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:13:55.170412 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.170395 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:13:55.170472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.170430 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:13:55.170472 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.170395 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:13:55.170558 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.170470 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:13:55.170558 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.170445 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:13:55.170558 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.170490 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:13:55.173228 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.173213 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.173305 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.173238 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 12:13:55.173924 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.173910 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 12:13:55.173989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.173939 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 12:13:55.173989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.173954 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 12:13:55.200247 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.200228 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-55.ec2.internal\" not found" node="ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.203579 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.203565 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-55.ec2.internal\" not found" node="ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.262919 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.262900 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 20 12:13:55.355348 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.355301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec74c72a1ae3da2b3b1eef59bb72e15d-config\") pod 
\"kube-apiserver-proxy-ip-10-0-131-55.ec2.internal\" (UID: \"ec74c72a1ae3da2b3b1eef59bb72e15d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.355348 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.355326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.355348 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.355346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.363386 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.363371 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found" Apr 20 12:13:55.456053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.456032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" Apr 20 12:13:55.456154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.456058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.456154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.456075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec74c72a1ae3da2b3b1eef59bb72e15d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-55.ec2.internal\" (UID: \"ec74c72a1ae3da2b3b1eef59bb72e15d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.456154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.456108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec74c72a1ae3da2b3b1eef59bb72e15d-config\") pod \"kube-apiserver-proxy-ip-10-0-131-55.ec2.internal\" (UID: \"ec74c72a1ae3da2b3b1eef59bb72e15d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.456154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.456121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.456154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.456127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3dfc4ac8bf9ace5a036e90036a3f5792-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal\" (UID: \"3dfc4ac8bf9ace5a036e90036a3f5792\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.464148 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.464123 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 20 12:13:55.502302 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.502287 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.505594 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.505579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.564714 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.564685 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 20 12:13:55.665284 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.665227 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-55.ec2.internal\" not found"
Apr 20 12:13:55.746682 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.746632 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:13:55.753363 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.753346 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.760723 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.760707 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 12:13:55.762825 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.762812 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal"
Apr 20 12:13:55.777699 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.777686 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 12:13:55.787812 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.787696 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:13:55.846926 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.846906 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 12:13:55.847553 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.847033 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 12:13:55.847553 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.847079 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 12:13:55.925166 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.925100 2577 apiserver.go:52] "Watching apiserver"
Apr 20 12:13:55.932415 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.932398 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 12:13:55.933461 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.933377 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-vwq8z","kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2","openshift-cluster-node-tuning-operator/tuned-hcg2g","openshift-dns/node-resolver-bj5lp","openshift-multus/network-metrics-daemon-jnnsm","openshift-network-diagnostics/network-check-target-gbv8h","openshift-network-operator/iptables-alerter-j2n82","openshift-image-registry/node-ca-zdf9s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal","openshift-multus/multus-additional-cni-plugins-d4m89","openshift-multus/multus-dfjcx","openshift-ovn-kubernetes/ovnkube-node-zmhl5"]
Apr 20 12:13:55.936412 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.936391 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vwq8z"
Apr 20 12:13:55.938522 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.938503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.938771 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.938751 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 12:13:55.938771 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.938768 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 12:13:55.938903 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.938768 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zpmrc\""
Apr 20 12:13:55.940633 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.940616 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sxnmk\""
Apr 20 12:13:55.940728 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.940711 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 12:13:55.941009 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.940986 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.941009 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.941005 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.941340 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.941326 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.943412 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.943394 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bj5lp"
Apr 20 12:13:55.943577 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.943558 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.943577 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.943564 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.943717 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.943585 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qsd9s\""
Apr 20 12:13:55.945501 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.945479 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.945592 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.945482 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-w7txh\""
Apr 20 12:13:55.945669 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.945595 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:55.945727 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.945685 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:13:55.946163 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.946150 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.947808 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.947793 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:13:55.947867 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:55.947844 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:13:55.950663 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.950627 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j2n82"
Apr 20 12:13:55.950889 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.950867 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 12:13:55.952936 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.952921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zdf9s"
Apr 20 12:13:55.954124 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.954108 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.954293 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.954282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.954478 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.954465 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 12:13:55.954514 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.954490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9mpdr\""
Apr 20 12:13:55.954961 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.954939 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.955054 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.955017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rh5tl\""
Apr 20 12:13:55.955054 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.955030 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 12:13:55.955285 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.955270 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.955963 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.955944 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.957583 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.957566 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.957690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.957583 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nngnt\""
Apr 20 12:13:55.957690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.957653 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 12:13:55.957883 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.957699 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.957883 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.957835 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 12:13:55.958107 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-host\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.958172 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:55.958172 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2whz\" (UniqueName: \"kubernetes.io/projected/016f5832-4461-44e1-b03e-5ca0dc88515d-kube-api-access-z2whz\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:55.958243 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-os-release\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.958243 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fbe4372b-f6e0-4562-b969-fe5fdeed773a-konnectivity-ca\") pod \"konnectivity-agent-vwq8z\" (UID: \"fbe4372b-f6e0-4562-b969-fe5fdeed773a\") " pod="kube-system/konnectivity-agent-vwq8z"
Apr 20 12:13:55.958303 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958267 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjnnp\" (UniqueName: \"kubernetes.io/projected/72c6c42a-45e7-4a4c-8577-2984a8123380-kube-api-access-fjnnp\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.958333 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/24b836ac-13ec-49aa-be4b-4250c8e79676-tmp-dir\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp"
Apr 20 12:13:55.959103 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.958696 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 12:13:55.959103 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a182a959-9bf8-48fe-b024-32a9f697eb23-host\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s"
Apr 20 12:13:55.959239 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqsv\" (UniqueName: \"kubernetes.io/projected/a182a959-9bf8-48fe-b024-32a9f697eb23-kube-api-access-cvqsv\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s"
Apr 20 12:13:55.959239 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:13:55.959239 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959236 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysctl-conf\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.959377 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.959377 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959316 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-modprobe-d\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.959377 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959345 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-kubernetes\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.959508 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysctl-d\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.959564 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dfjcx"
Apr 20 12:13:55.959623 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8psnt\" (UniqueName: \"kubernetes.io/projected/24b836ac-13ec-49aa-be4b-4250c8e79676-kube-api-access-8psnt\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp"
Apr 20 12:13:55.960696 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.959659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d752e3f9-624e-42e0-8b33-7285148161c0-iptables-alerter-script\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82"
Apr 20 12:13:55.960696 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.960305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-system-cni-dir\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.960696 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.960336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-cnibin\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.960696 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.960502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d752e3f9-624e-42e0-8b33-7285148161c0-host-slash\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82"
Apr 20 12:13:55.960696 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.960573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-lib-modules\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.960922 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.960681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/24b836ac-13ec-49aa-be4b-4250c8e79676-hosts-file\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp"
Apr 20 12:13:55.961543 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-socket-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.961673 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-device-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.961673 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-etc-selinux\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.961801 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72c6c42a-45e7-4a4c-8577-2984a8123380-tmp\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.961801 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a182a959-9bf8-48fe-b024-32a9f697eb23-serviceca\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s"
Apr 20 12:13:55.961894 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.961894 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.962004 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.962004 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fbe4372b-f6e0-4562-b969-fe5fdeed773a-agent-certs\") pod \"konnectivity-agent-vwq8z\" (UID: \"fbe4372b-f6e0-4562-b969-fe5fdeed773a\") " pod="kube-system/konnectivity-agent-vwq8z"
Apr 20 12:13:55.962004 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961948 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-run\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.962004 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.961976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-tuned\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962004 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-systemd\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj6nd\" (UniqueName: \"kubernetes.io/projected/272f753b-f685-4425-8290-d42ee3ab9738-kube-api-access-pj6nd\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-registration-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-sys-fs\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962144 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2blst\" (UniqueName: \"kubernetes.io/projected/a60f63c2-4c9f-486c-b55b-1436aae015dc-kube-api-access-2blst\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67b2\" (UniqueName: \"kubernetes.io/projected/d752e3f9-624e-42e0-8b33-7285148161c0-kube-api-access-t67b2\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysconfig\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.962249 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962221 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-sys\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.962835 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.962288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-var-lib-kubelet\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:55.963039 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.963017 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gqd4g\""
Apr 20 12:13:55.963488 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.963469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5"
Apr 20 12:13:55.963595 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.963477 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 12:13:55.964810 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.964792 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 12:13:55.965831 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.965813 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 12:13:55.966854 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.966816 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 12:08:54 +0000 UTC" deadline="2027-11-26 18:44:01.511751084 +0000 UTC"
Apr 20 12:13:55.966854 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.966847 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14046h30m5.544906555s"
Apr 20 12:13:55.966989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.966967 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 12:13:55.966989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.966983 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 12:13:55.967104 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.966986 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 12:13:55.967400 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.967386 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 12:13:55.967448 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.967402 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rjm22\""
Apr 20 12:13:55.967491 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:55.967478 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 12:13:56.020777 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.020749 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rqn48"
Apr 20 12:13:56.027721 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.027699 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rqn48"
Apr 20 12:13:56.054758 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.054730 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 12:13:56.063107 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t67b2\" (UniqueName: \"kubernetes.io/projected/d752e3f9-624e-42e0-8b33-7285148161c0-kube-api-access-t67b2\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82"
Apr 20 12:13:56.063193 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063115 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysconfig\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:56.063193 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063133 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-sys\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:56.063193 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063149 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2whz\" (UniqueName: \"kubernetes.io/projected/016f5832-4461-44e1-b03e-5ca0dc88515d-kube-api-access-z2whz\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:56.063193 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-os-release\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:56.063351 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-sys\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:56.063351 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-os-release\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89"
Apr 20 12:13:56.063351 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysconfig\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g"
Apr 20 12:13:56.063495 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5"
Apr 20 12:13:56.063495 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjnnp\" (UniqueName: \"kubernetes.io/projected/72c6c42a-45e7-4a4c-8577-2984a8123380-kube-api-access-fjnnp\") pod
\"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.063495 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/24b836ac-13ec-49aa-be4b-4250c8e79676-tmp-dir\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.063495 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqsv\" (UniqueName: \"kubernetes.io/projected/a182a959-9bf8-48fe-b024-32a9f697eb23-kube-api-access-cvqsv\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-cni-multus\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-kubelet\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-conf-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-etc-kubernetes\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/24b836ac-13ec-49aa-be4b-4250c8e79676-tmp-dir\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.063740 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysctl-conf\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-ovnkube-script-lib\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysctl-conf\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-modprobe-d\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysctl-d\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063885 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-system-cni-dir\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-cnibin\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-device-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-etc-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-system-cni-dir\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 
12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-cnibin\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063975 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-env-overrides\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-modprobe-d\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.063982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-device-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-sysctl-d\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " 
pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc64x\" (UniqueName: \"kubernetes.io/projected/2d527830-9151-40c6-884f-3c8497f96667-kube-api-access-tc64x\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-socket-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-k8s-cni-cncf-io\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064107 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-cni-bin\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064153 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-slash\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-socket-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-run-netns\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-run\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064260 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-cni-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-run\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-systemd-units\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6nd\" (UniqueName: \"kubernetes.io/projected/272f753b-f685-4425-8290-d42ee3ab9738-kube-api-access-pj6nd\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064397 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064424 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-sys-fs\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2blst\" (UniqueName: \"kubernetes.io/projected/a60f63c2-4c9f-486c-b55b-1436aae015dc-kube-api-access-2blst\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064490 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-cnibin\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.064958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064533 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-hostroot\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.065816 ip-10-0-131-55 
kubenswrapper[2577]: I0420 12:13:56.064536 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-sys-fs\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn6vb\" (UniqueName: \"kubernetes.io/projected/f58cdd5e-92df-4b3b-b634-065a2b1275f5-kube-api-access-wn6vb\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-var-lib-kubelet\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-var-lib-kubelet\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-host\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.065816 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-system-cni-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-var-lib-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-host\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.064852 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-node-log\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-log-socket\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-cni-bin\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.064991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fbe4372b-f6e0-4562-b969-fe5fdeed773a-konnectivity-ca\") pod \"konnectivity-agent-vwq8z\" (UID: \"fbe4372b-f6e0-4562-b969-fe5fdeed773a\") " pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a182a959-9bf8-48fe-b024-32a9f697eb23-host\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-os-release\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.065816 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.065047 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:13:56.565020024 +0000 UTC m=+2.117244198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a182a959-9bf8-48fe-b024-32a9f697eb23-host\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d527830-9151-40c6-884f-3c8497f96667-ovn-node-metrics-cert\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065129 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/f58cdd5e-92df-4b3b-b634-065a2b1275f5-cni-binary-copy\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-daemon-config\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-kubernetes\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8psnt\" (UniqueName: \"kubernetes.io/projected/24b836ac-13ec-49aa-be4b-4250c8e79676-kube-api-access-8psnt\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-kubernetes\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/d752e3f9-624e-42e0-8b33-7285148161c0-iptables-alerter-script\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-etc-selinux\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d752e3f9-624e-42e0-8b33-7285148161c0-host-slash\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065327 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-lib-modules\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/24b836ac-13ec-49aa-be4b-4250c8e79676-hosts-file\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065364 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-kubelet\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d752e3f9-624e-42e0-8b33-7285148161c0-host-slash\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-run-ovn-kubernetes\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-cni-netd\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.066613 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72c6c42a-45e7-4a4c-8577-2984a8123380-tmp\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: 
I0420 12:13:56.065447 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-lib-modules\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065464 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-etc-selinux\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065485 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a182a959-9bf8-48fe-b024-32a9f697eb23-serviceca\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/24b836ac-13ec-49aa-be4b-4250c8e79676-hosts-file\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 
12:13:56.065569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fbe4372b-f6e0-4562-b969-fe5fdeed773a-konnectivity-ca\") pod \"konnectivity-agent-vwq8z\" (UID: \"fbe4372b-f6e0-4562-b969-fe5fdeed773a\") " pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-multus-certs\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fbe4372b-f6e0-4562-b969-fe5fdeed773a-agent-certs\") pod \"konnectivity-agent-vwq8z\" (UID: \"fbe4372b-f6e0-4562-b969-fe5fdeed773a\") " pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-tuned\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d752e3f9-624e-42e0-8b33-7285148161c0-iptables-alerter-script\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065869 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-socket-dir-parent\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-systemd\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-systemd\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.067529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-registration-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a182a959-9bf8-48fe-b024-32a9f697eb23-serviceca\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.065793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/272f753b-f685-4425-8290-d42ee3ab9738-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066030 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-netns\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-ovn\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-ovnkube-config\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/a60f63c2-4c9f-486c-b55b-1436aae015dc-registration-dir\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/272f753b-f685-4425-8290-d42ee3ab9738-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.068342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.066383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-systemd\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.069667 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.068946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72c6c42a-45e7-4a4c-8577-2984a8123380-tmp\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.069667 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.069285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fbe4372b-f6e0-4562-b969-fe5fdeed773a-agent-certs\") pod \"konnectivity-agent-vwq8z\" (UID: \"fbe4372b-f6e0-4562-b969-fe5fdeed773a\") " pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:13:56.069667 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.069624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72c6c42a-45e7-4a4c-8577-2984a8123380-etc-tuned\") pod \"tuned-hcg2g\" (UID: \"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.075712 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.075663 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 12:13:56.075803 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.075720 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 12:13:56.075803 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.075734 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:13:56.075902 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.075841 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:13:56.575807524 +0000 UTC m=+2.128031707 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:13:56.078654 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.078580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8psnt\" (UniqueName: \"kubernetes.io/projected/24b836ac-13ec-49aa-be4b-4250c8e79676-kube-api-access-8psnt\") pod \"node-resolver-bj5lp\" (UID: \"24b836ac-13ec-49aa-be4b-4250c8e79676\") " pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.079119 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.079097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2blst\" (UniqueName: \"kubernetes.io/projected/a60f63c2-4c9f-486c-b55b-1436aae015dc-kube-api-access-2blst\") pod \"aws-ebs-csi-driver-node-2tgl2\" (UID: \"a60f63c2-4c9f-486c-b55b-1436aae015dc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.079202 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.079156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2whz\" (UniqueName: \"kubernetes.io/projected/016f5832-4461-44e1-b03e-5ca0dc88515d-kube-api-access-z2whz\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:13:56.079257 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.079224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjnnp\" (UniqueName: \"kubernetes.io/projected/72c6c42a-45e7-4a4c-8577-2984a8123380-kube-api-access-fjnnp\") pod \"tuned-hcg2g\" (UID: 
\"72c6c42a-45e7-4a4c-8577-2984a8123380\") " pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.079257 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.079231 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67b2\" (UniqueName: \"kubernetes.io/projected/d752e3f9-624e-42e0-8b33-7285148161c0-kube-api-access-t67b2\") pod \"iptables-alerter-j2n82\" (UID: \"d752e3f9-624e-42e0-8b33-7285148161c0\") " pod="openshift-network-operator/iptables-alerter-j2n82" Apr 20 12:13:56.079479 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.079461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqsv\" (UniqueName: \"kubernetes.io/projected/a182a959-9bf8-48fe-b024-32a9f697eb23-kube-api-access-cvqsv\") pod \"node-ca-zdf9s\" (UID: \"a182a959-9bf8-48fe-b024-32a9f697eb23\") " pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.079700 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.079684 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6nd\" (UniqueName: \"kubernetes.io/projected/272f753b-f685-4425-8290-d42ee3ab9738-kube-api-access-pj6nd\") pod \"multus-additional-cni-plugins-d4m89\" (UID: \"272f753b-f685-4425-8290-d42ee3ab9738\") " pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.126309 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.126279 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec74c72a1ae3da2b3b1eef59bb72e15d.slice/crio-dd1774270baa5df33b92fc733d1e98e2a14f8027038a6a3853675f384f109990 WatchSource:0}: Error finding container dd1774270baa5df33b92fc733d1e98e2a14f8027038a6a3853675f384f109990: Status 404 returned error can't find the container with id dd1774270baa5df33b92fc733d1e98e2a14f8027038a6a3853675f384f109990 Apr 20 12:13:56.133852 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.133835 2577 
provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 12:13:56.140446 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.139390 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfc4ac8bf9ace5a036e90036a3f5792.slice/crio-c0dde32085156ae51344ec95fa40bcb67c03df2ff4a507a1454650bcbb525d68 WatchSource:0}: Error finding container c0dde32085156ae51344ec95fa40bcb67c03df2ff4a507a1454650bcbb525d68: Status 404 returned error can't find the container with id c0dde32085156ae51344ec95fa40bcb67c03df2ff4a507a1454650bcbb525d68 Apr 20 12:13:56.167163 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167135 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-kubelet\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167163 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-run-ovn-kubernetes\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-cni-netd\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167236 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-multus-certs\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167262 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-run-ovn-kubernetes\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-cni-netd\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-socket-dir-parent\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167332 ip-10-0-131-55 
kubenswrapper[2577]: I0420 12:13:56.167292 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-multus-certs\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-kubelet\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-systemd\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-socket-dir-parent\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167344 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-systemd\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-netns\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-netns\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-ovn\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-ovnkube-config\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167446 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-ovn\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-cni-multus\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-kubelet\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-conf-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167548 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-cni-multus\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-run-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-etc-kubernetes\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-kubelet\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167587 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-etc-kubernetes\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.167709 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-conf-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-ovnkube-script-lib\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-etc-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-env-overrides\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc64x\" (UniqueName: \"kubernetes.io/projected/2d527830-9151-40c6-884f-3c8497f96667-kube-api-access-tc64x\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-k8s-cni-cncf-io\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-etc-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167772 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-cni-bin\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167805 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-var-lib-cni-bin\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-slash\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-run-netns\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-host-run-k8s-cni-cncf-io\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-slash\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-cni-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-run-netns\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-systemd-units\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-cnibin\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-cni-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-systemd-units\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.167999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-ovnkube-config\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-hostroot\") pod 
\"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-cnibin\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn6vb\" (UniqueName: \"kubernetes.io/projected/f58cdd5e-92df-4b3b-b634-065a2b1275f5-kube-api-access-wn6vb\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-hostroot\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-system-cni-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-var-lib-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-ovnkube-script-lib\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-node-log\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d527830-9151-40c6-884f-3c8497f96667-env-overrides\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168146 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-system-cni-dir\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168162 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-log-socket\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-var-lib-openvswitch\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-node-log\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-cni-bin\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-log-socket\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d527830-9151-40c6-884f-3c8497f96667-host-cni-bin\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.168986 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-os-release\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.169483 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d527830-9151-40c6-884f-3c8497f96667-ovn-node-metrics-cert\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.169483 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f58cdd5e-92df-4b3b-b634-065a2b1275f5-cni-binary-copy\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.169483 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f58cdd5e-92df-4b3b-b634-065a2b1275f5-os-release\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.169483 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-daemon-config\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 
12:13:56.169483 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.168918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f58cdd5e-92df-4b3b-b634-065a2b1275f5-multus-daemon-config\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.169483 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.169381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f58cdd5e-92df-4b3b-b634-065a2b1275f5-cni-binary-copy\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.170433 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.170411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d527830-9151-40c6-884f-3c8497f96667-ovn-node-metrics-cert\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.176461 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.176417 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn6vb\" (UniqueName: \"kubernetes.io/projected/f58cdd5e-92df-4b3b-b634-065a2b1275f5-kube-api-access-wn6vb\") pod \"multus-dfjcx\" (UID: \"f58cdd5e-92df-4b3b-b634-065a2b1275f5\") " pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.176727 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.176713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc64x\" (UniqueName: \"kubernetes.io/projected/2d527830-9151-40c6-884f-3c8497f96667-kube-api-access-tc64x\") pod \"ovnkube-node-zmhl5\" (UID: \"2d527830-9151-40c6-884f-3c8497f96667\") " pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.230069 ip-10-0-131-55 
kubenswrapper[2577]: I0420 12:13:56.230043 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:13:56.256507 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.256484 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:13:56.262725 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.262703 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe4372b_f6e0_4562_b969_fe5fdeed773a.slice/crio-c7afed86baf0e5c1f9f5d30597e83a242e4e602b9be1d207a8324f070407e6a7 WatchSource:0}: Error finding container c7afed86baf0e5c1f9f5d30597e83a242e4e602b9be1d207a8324f070407e6a7: Status 404 returned error can't find the container with id c7afed86baf0e5c1f9f5d30597e83a242e4e602b9be1d207a8324f070407e6a7 Apr 20 12:13:56.280220 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.280201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" Apr 20 12:13:56.286234 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.286214 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60f63c2_4c9f_486c_b55b_1436aae015dc.slice/crio-b19457f86cdd04e6b1fdee3be26e4be1bd2c49a0a7ba4a0679dc7767be9853ec WatchSource:0}: Error finding container b19457f86cdd04e6b1fdee3be26e4be1bd2c49a0a7ba4a0679dc7767be9853ec: Status 404 returned error can't find the container with id b19457f86cdd04e6b1fdee3be26e4be1bd2c49a0a7ba4a0679dc7767be9853ec Apr 20 12:13:56.287688 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.287671 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" Apr 20 12:13:56.292268 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.292251 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bj5lp" Apr 20 12:13:56.293450 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.293431 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72c6c42a_45e7_4a4c_8577_2984a8123380.slice/crio-f57a067ada0f02d4de1d9c16712100a344ceb5d511894a076c5400f095fc83c4 WatchSource:0}: Error finding container f57a067ada0f02d4de1d9c16712100a344ceb5d511894a076c5400f095fc83c4: Status 404 returned error can't find the container with id f57a067ada0f02d4de1d9c16712100a344ceb5d511894a076c5400f095fc83c4 Apr 20 12:13:56.299239 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.299215 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b836ac_13ec_49aa_be4b_4250c8e79676.slice/crio-2a7a231f74fee83cafcbb2e5fd941ecded4e5d614f301c34982ba667ca8fc3de WatchSource:0}: Error finding container 2a7a231f74fee83cafcbb2e5fd941ecded4e5d614f301c34982ba667ca8fc3de: Status 404 returned error can't find the container with id 2a7a231f74fee83cafcbb2e5fd941ecded4e5d614f301c34982ba667ca8fc3de Apr 20 12:13:56.320208 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.320180 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-j2n82" Apr 20 12:13:56.325517 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.325497 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd752e3f9_624e_42e0_8b33_7285148161c0.slice/crio-7aee7db13d411a64d477087dd17df810d2095f5d3d51ae1e8b88c6319df61927 WatchSource:0}: Error finding container 7aee7db13d411a64d477087dd17df810d2095f5d3d51ae1e8b88c6319df61927: Status 404 returned error can't find the container with id 7aee7db13d411a64d477087dd17df810d2095f5d3d51ae1e8b88c6319df61927 Apr 20 12:13:56.335879 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.335855 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zdf9s" Apr 20 12:13:56.343043 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.343022 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda182a959_9bf8_48fe_b024_32a9f697eb23.slice/crio-3191379d8848b2eab3caea170f160cc573a05acceacf9bd7a753f73bf8f31938 WatchSource:0}: Error finding container 3191379d8848b2eab3caea170f160cc573a05acceacf9bd7a753f73bf8f31938: Status 404 returned error can't find the container with id 3191379d8848b2eab3caea170f160cc573a05acceacf9bd7a753f73bf8f31938 Apr 20 12:13:56.356202 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.356185 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d4m89" Apr 20 12:13:56.363043 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.363022 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod272f753b_f685_4425_8290_d42ee3ab9738.slice/crio-0fc2158f1c1af774bbaf59687b615a0377ad07d9ba3bca889059b50d1a615690 WatchSource:0}: Error finding container 0fc2158f1c1af774bbaf59687b615a0377ad07d9ba3bca889059b50d1a615690: Status 404 returned error can't find the container with id 0fc2158f1c1af774bbaf59687b615a0377ad07d9ba3bca889059b50d1a615690 Apr 20 12:13:56.376198 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.376179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dfjcx" Apr 20 12:13:56.382092 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.382072 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf58cdd5e_92df_4b3b_b634_065a2b1275f5.slice/crio-698c44b8a3a629e25374ede8e026107b158b5090af77c059f93adf6eb93ebc22 WatchSource:0}: Error finding container 698c44b8a3a629e25374ede8e026107b158b5090af77c059f93adf6eb93ebc22: Status 404 returned error can't find the container with id 698c44b8a3a629e25374ede8e026107b158b5090af77c059f93adf6eb93ebc22 Apr 20 12:13:56.390089 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.390071 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:13:56.396028 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:13:56.396001 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d527830_9151_40c6_884f_3c8497f96667.slice/crio-dda10bbc8eeb1f766a9358179c8e911215b613b66f10c656e8ae783d4f81c223 WatchSource:0}: Error finding container dda10bbc8eeb1f766a9358179c8e911215b613b66f10c656e8ae783d4f81c223: Status 404 returned error can't find the container with id dda10bbc8eeb1f766a9358179c8e911215b613b66f10c656e8ae783d4f81c223 Apr 20 12:13:56.572688 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.572581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:13:56.572858 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.572749 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:13:56.572858 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.572820 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:13:57.572801907 +0000 UTC m=+3.125026073 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:13:56.663303 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.663270 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 12:13:56.674148 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:56.674089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:13:56.674516 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.674496 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 12:13:56.674653 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.674529 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 12:13:56.674653 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.674571 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:13:56.674756 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:56.674726 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:13:57.674676844 +0000 UTC m=+3.226901024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:13:57.029151 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.028805 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 12:08:56 +0000 UTC" deadline="2028-02-03 19:05:30.251722734 +0000 UTC" Apr 20 12:13:57.029151 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.029054 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15702h51m33.222674063s" Apr 20 12:13:57.078333 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.075811 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:13:57.078333 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.075986 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:13:57.084894 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.084802 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"dda10bbc8eeb1f766a9358179c8e911215b613b66f10c656e8ae783d4f81c223"} Apr 20 12:13:57.088232 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.088208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerStarted","Data":"0fc2158f1c1af774bbaf59687b615a0377ad07d9ba3bca889059b50d1a615690"} Apr 20 12:13:57.090406 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.090358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j2n82" event={"ID":"d752e3f9-624e-42e0-8b33-7285148161c0","Type":"ContainerStarted","Data":"7aee7db13d411a64d477087dd17df810d2095f5d3d51ae1e8b88c6319df61927"} Apr 20 12:13:57.101653 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.101613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vwq8z" event={"ID":"fbe4372b-f6e0-4562-b969-fe5fdeed773a","Type":"ContainerStarted","Data":"c7afed86baf0e5c1f9f5d30597e83a242e4e602b9be1d207a8324f070407e6a7"} Apr 20 12:13:57.118670 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.118495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" event={"ID":"3dfc4ac8bf9ace5a036e90036a3f5792","Type":"ContainerStarted","Data":"c0dde32085156ae51344ec95fa40bcb67c03df2ff4a507a1454650bcbb525d68"} Apr 20 12:13:57.134659 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.134621 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" event={"ID":"ec74c72a1ae3da2b3b1eef59bb72e15d","Type":"ContainerStarted","Data":"dd1774270baa5df33b92fc733d1e98e2a14f8027038a6a3853675f384f109990"} Apr 20 12:13:57.139989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.139963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dfjcx" event={"ID":"f58cdd5e-92df-4b3b-b634-065a2b1275f5","Type":"ContainerStarted","Data":"698c44b8a3a629e25374ede8e026107b158b5090af77c059f93adf6eb93ebc22"} Apr 20 12:13:57.145246 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.145223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zdf9s" event={"ID":"a182a959-9bf8-48fe-b024-32a9f697eb23","Type":"ContainerStarted","Data":"3191379d8848b2eab3caea170f160cc573a05acceacf9bd7a753f73bf8f31938"} Apr 20 12:13:57.153625 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.153603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bj5lp" event={"ID":"24b836ac-13ec-49aa-be4b-4250c8e79676","Type":"ContainerStarted","Data":"2a7a231f74fee83cafcbb2e5fd941ecded4e5d614f301c34982ba667ca8fc3de"} Apr 20 12:13:57.163119 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.161259 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" event={"ID":"72c6c42a-45e7-4a4c-8577-2984a8123380","Type":"ContainerStarted","Data":"f57a067ada0f02d4de1d9c16712100a344ceb5d511894a076c5400f095fc83c4"} Apr 20 12:13:57.167068 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.167011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" event={"ID":"a60f63c2-4c9f-486c-b55b-1436aae015dc","Type":"ContainerStarted","Data":"b19457f86cdd04e6b1fdee3be26e4be1bd2c49a0a7ba4a0679dc7767be9853ec"} Apr 20 12:13:57.582568 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.582493 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:57.582831 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.582684 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:13:57.582831 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.582745 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:13:59.582726357 +0000 UTC m=+5.134950538 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:13:57.683879 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:57.683836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:13:57.684038 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.684028 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:13:57.684113 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.684048 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:13:57.684113 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.684061 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:13:57.684212 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:57.684116 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:13:59.684098445 +0000 UTC m=+5.236322624 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:13:58.030295 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:58.030196 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 12:08:56 +0000 UTC" deadline="2028-01-05 18:17:37.627855374 +0000 UTC"
Apr 20 12:13:58.030295 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:58.030236 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15006h3m39.597623005s"
Apr 20 12:13:58.066702 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:58.066676 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:13:58.066846 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:58.066799 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:13:58.877346 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:58.877317 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 12:13:59.068906 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:59.068849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:59.069323 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.068981 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:13:59.600209 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:59.600172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:13:59.600443 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.600421 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:13:59.600516 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.600492 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:14:03.600474307 +0000 UTC m=+9.152698476 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:13:59.701085 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:13:59.701046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:13:59.701261 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.701228 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:13:59.701261 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.701255 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:13:59.701378 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.701269 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:13:59.701378 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:13:59.701331 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:14:03.701311812 +0000 UTC m=+9.253535990 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:00.067385 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:00.066831 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:00.067385 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:00.066967 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:01.066876 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:01.066330 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:01.066876 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:01.066462 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:02.066524 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:02.066491 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:02.066730 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:02.066608 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:03.066683 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:03.066442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:03.066683 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.066553 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:03.633391 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:03.632807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:03.633391 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.632940 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:03.633391 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.633012 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:14:11.632992751 +0000 UTC m=+17.185216920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:03.733653 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:03.733608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:03.733922 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.733793 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:03.733922 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.733814 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:03.733922 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.733823 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:03.733922 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:03.733886 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:14:11.733866672 +0000 UTC m=+17.286090850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:04.066694 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:04.066571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:04.067211 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:04.066726 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:05.067352 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:05.067321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:05.067866 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:05.067428 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:06.066413 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:06.066381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:06.066593 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:06.066508 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:07.066999 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:07.066968 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:07.067464 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:07.067099 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:08.066656 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:08.066618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:08.066821 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:08.066757 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:09.066047 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:09.066017 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:09.066476 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:09.066143 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:10.065949 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:10.065915 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:10.066085 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:10.066007 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:11.066313 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.066274 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:11.066773 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.066403 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:11.304943 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.304912 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rkbqg"]
Apr 20 12:14:11.339298 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.339236 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.339439 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.339310 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b"
Apr 20 12:14:11.391095 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.391059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/51f2c5d6-8d34-4caf-b764-5fd970fa149b-dbus\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.391254 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.391104 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.391254 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.391186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/51f2c5d6-8d34-4caf-b764-5fd970fa149b-kubelet-config\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.492539 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.492507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/51f2c5d6-8d34-4caf-b764-5fd970fa149b-dbus\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.492699 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.492557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.492699 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.492588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/51f2c5d6-8d34-4caf-b764-5fd970fa149b-kubelet-config\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.492789 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.492707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/51f2c5d6-8d34-4caf-b764-5fd970fa149b-kubelet-config\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.492789 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.492722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/51f2c5d6-8d34-4caf-b764-5fd970fa149b-dbus\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.492789 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.492730 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:11.492895 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.492799 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret podName:51f2c5d6-8d34-4caf-b764-5fd970fa149b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:11.992782683 +0000 UTC m=+17.545006866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret") pod "global-pull-secret-syncer-rkbqg" (UID: "51f2c5d6-8d34-4caf-b764-5fd970fa149b") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:11.694363 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.694280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:11.694517 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.694427 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:11.694517 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.694484 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:14:27.694470728 +0000 UTC m=+33.246694894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 12:14:11.795161 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.795128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:11.795331 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.795315 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 12:14:11.795372 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.795344 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 12:14:11.795372 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.795358 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:11.795457 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.795426 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:14:27.795407044 +0000 UTC m=+33.347631217 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 12:14:11.996397 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:11.996299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:11.996559 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.996458 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:11.996559 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:11.996529 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret podName:51f2c5d6-8d34-4caf-b764-5fd970fa149b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:12.996511897 +0000 UTC m=+18.548736069 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret") pod "global-pull-secret-syncer-rkbqg" (UID: "51f2c5d6-8d34-4caf-b764-5fd970fa149b") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:12.066748 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:12.066714 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:12.067188 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:12.066837 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:13.003289 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:13.003258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:13.003450 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:13.003362 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:13.003450 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:13.003416 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret podName:51f2c5d6-8d34-4caf-b764-5fd970fa149b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:15.003400039 +0000 UTC m=+20.555624207 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret") pod "global-pull-secret-syncer-rkbqg" (UID: "51f2c5d6-8d34-4caf-b764-5fd970fa149b") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:13.066179 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:13.066152 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:13.066354 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:13.066157 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:13.066354 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:13.066257 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b"
Apr 20 12:14:13.066467 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:13.066362 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d"
Apr 20 12:14:14.065972 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:14.065934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:14.066370 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:14.066048 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe"
Apr 20 12:14:15.020009 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.019986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:15.020099 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:15.020087 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:15.020148 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:15.020134 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret podName:51f2c5d6-8d34-4caf-b764-5fd970fa149b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:19.020120623 +0000 UTC m=+24.572344789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret") pod "global-pull-secret-syncer-rkbqg" (UID: "51f2c5d6-8d34-4caf-b764-5fd970fa149b") : object "kube-system"/"original-pull-secret" not registered
Apr 20 12:14:15.069094 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.069072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg"
Apr 20 12:14:15.069610 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:15.069579 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b"
Apr 20 12:14:15.069719 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.069172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:15.069790 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:15.069764 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:15.199190 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.198987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" event={"ID":"ec74c72a1ae3da2b3b1eef59bb72e15d","Type":"ContainerStarted","Data":"016c68236993d90618031c0e585d9651139aa30ac516b8c41060d96ad6c44bed"} Apr 20 12:14:15.201439 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.201357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dfjcx" event={"ID":"f58cdd5e-92df-4b3b-b634-065a2b1275f5","Type":"ContainerStarted","Data":"2d6cfdd5b43a8c1a39e48bc23eca9fda7e93ff32747f2f1a40e1139f410797bd"} Apr 20 12:14:15.203700 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.203676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" event={"ID":"72c6c42a-45e7-4a4c-8577-2984a8123380","Type":"ContainerStarted","Data":"f0d1648884d4f6a3784ab7bc7a3dd25e71b61adc94bfc5a4280a2c1e73614606"} Apr 20 12:14:15.205143 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.205124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"98313272e742e2887bb4511ef03b0e81774866e4f1d3e9267812dba7c54daa2d"} Apr 20 12:14:15.236076 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.235986 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hcg2g" podStartSLOduration=1.6313782620000001 podStartE2EDuration="20.235971201s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.295723475 +0000 UTC m=+1.847947642" lastFinishedPulling="2026-04-20 12:14:14.900316403 +0000 UTC m=+20.452540581" observedRunningTime="2026-04-20 
12:14:15.235836534 +0000 UTC m=+20.788060758" watchObservedRunningTime="2026-04-20 12:14:15.235971201 +0000 UTC m=+20.788195388" Apr 20 12:14:15.236735 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.236695 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-55.ec2.internal" podStartSLOduration=20.236684566 podStartE2EDuration="20.236684566s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:14:15.220261212 +0000 UTC m=+20.772485404" watchObservedRunningTime="2026-04-20 12:14:15.236684566 +0000 UTC m=+20.788908824" Apr 20 12:14:15.252219 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:15.252165 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dfjcx" podStartSLOduration=1.532698641 podStartE2EDuration="20.252148054s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.383535904 +0000 UTC m=+1.935760070" lastFinishedPulling="2026-04-20 12:14:15.102985301 +0000 UTC m=+20.655209483" observedRunningTime="2026-04-20 12:14:15.250879605 +0000 UTC m=+20.803103828" watchObservedRunningTime="2026-04-20 12:14:15.252148054 +0000 UTC m=+20.804372242" Apr 20 12:14:16.066972 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.066723 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:16.067097 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:16.066980 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe" Apr 20 12:14:16.208322 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.208294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" event={"ID":"a60f63c2-4c9f-486c-b55b-1436aae015dc","Type":"ContainerStarted","Data":"e4f60fe427ac352cc4be7c7a6ff938ec632868d1ceaeaf48e1a726567faafc72"} Apr 20 12:14:16.210708 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.210680 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"a9772d555cfcc41b18708b5f09793e3041b85640abfe88266763a9f6dd264065"} Apr 20 12:14:16.210708 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.210709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"ee70e360c9f1e9cc1d0e7c6f98ae79d38c4a3e02bf86c2d425aa70a40fb735f8"} Apr 20 12:14:16.210824 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.210723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"645a3852fbff01c5a9549789e1bc9c8741a2b594017e4fdbe9ac845d2d35b01f"} Apr 20 12:14:16.210824 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.210737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"334b5f08fe7861e8d1fffcee924b8b0e88f6396c8a8f34af5c0c73babb92d271"} Apr 20 12:14:16.210824 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.210749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"bfc398041db1573230049c63a788ff499ab864ff6964e66e15b7569663e0cee6"} Apr 20 12:14:16.211915 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.211893 2577 generic.go:358] "Generic (PLEG): container finished" podID="272f753b-f685-4425-8290-d42ee3ab9738" containerID="3ca948b4a6a2c8031090296ac51e824619cf4c73de53e94584d53ae50799698b" exitCode=0 Apr 20 12:14:16.212012 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.211961 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerDied","Data":"3ca948b4a6a2c8031090296ac51e824619cf4c73de53e94584d53ae50799698b"} Apr 20 12:14:16.213603 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.213579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j2n82" event={"ID":"d752e3f9-624e-42e0-8b33-7285148161c0","Type":"ContainerStarted","Data":"5b8bb32100c9541244f1db80e45ecb38f2559b7bd849ac364b5412c49e400caf"} Apr 20 12:14:16.214824 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.214798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vwq8z" event={"ID":"fbe4372b-f6e0-4562-b969-fe5fdeed773a","Type":"ContainerStarted","Data":"c9c2ee0bf8721d0e89a45853b962e5552a43ad41fafa4ff0be3a8fbe56ce795f"} Apr 20 12:14:16.216053 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.216032 2577 generic.go:358] "Generic (PLEG): container finished" podID="3dfc4ac8bf9ace5a036e90036a3f5792" containerID="f8f67e709a3004970da21e7175191fd3ae3202ec28c97a002eaea3748310004f" exitCode=0 Apr 20 12:14:16.216130 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.216091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" 
event={"ID":"3dfc4ac8bf9ace5a036e90036a3f5792","Type":"ContainerDied","Data":"f8f67e709a3004970da21e7175191fd3ae3202ec28c97a002eaea3748310004f"} Apr 20 12:14:16.217384 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.217360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zdf9s" event={"ID":"a182a959-9bf8-48fe-b024-32a9f697eb23","Type":"ContainerStarted","Data":"84916b88e858b7ce314a1e37b2c9c9627455add035b74981e9fc068cd7254a22"} Apr 20 12:14:16.219410 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.219390 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bj5lp" event={"ID":"24b836ac-13ec-49aa-be4b-4250c8e79676","Type":"ContainerStarted","Data":"fac53e6c798ffcccca33a7a500ef7e533b96b9181e21a8e988c294b1e8f01ab5"} Apr 20 12:14:16.247143 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.246789 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vwq8z" podStartSLOduration=2.612537919 podStartE2EDuration="21.246776745s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.26424345 +0000 UTC m=+1.816467616" lastFinishedPulling="2026-04-20 12:14:14.898482263 +0000 UTC m=+20.450706442" observedRunningTime="2026-04-20 12:14:16.246456439 +0000 UTC m=+21.798680640" watchObservedRunningTime="2026-04-20 12:14:16.246776745 +0000 UTC m=+21.799000932" Apr 20 12:14:16.261759 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.261096 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bj5lp" podStartSLOduration=2.662084291 podStartE2EDuration="21.26108612s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.30108651 +0000 UTC m=+1.853310675" lastFinishedPulling="2026-04-20 12:14:14.900088323 +0000 UTC m=+20.452312504" observedRunningTime="2026-04-20 12:14:16.260659195 +0000 UTC m=+21.812883568" 
watchObservedRunningTime="2026-04-20 12:14:16.26108612 +0000 UTC m=+21.813310312" Apr 20 12:14:16.275845 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.275810 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zdf9s" podStartSLOduration=2.721336637 podStartE2EDuration="21.275799084s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.344443326 +0000 UTC m=+1.896667494" lastFinishedPulling="2026-04-20 12:14:14.898905761 +0000 UTC m=+20.451129941" observedRunningTime="2026-04-20 12:14:16.275284946 +0000 UTC m=+21.827509138" watchObservedRunningTime="2026-04-20 12:14:16.275799084 +0000 UTC m=+21.828023272" Apr 20 12:14:16.306615 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.306575 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j2n82" podStartSLOduration=2.776190193 podStartE2EDuration="21.3065638s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.327298295 +0000 UTC m=+1.879522461" lastFinishedPulling="2026-04-20 12:14:14.857671902 +0000 UTC m=+20.409896068" observedRunningTime="2026-04-20 12:14:16.306114811 +0000 UTC m=+21.858338998" watchObservedRunningTime="2026-04-20 12:14:16.3065638 +0000 UTC m=+21.858787988" Apr 20 12:14:16.958700 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:16.958668 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 12:14:17.026780 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.026668 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T12:14:16.958684085Z","UUID":"870d255d-f07e-4461-9b92-c1284a3624ae","Handler":null,"Name":"","Endpoint":""} Apr 20 12:14:17.029206 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.029187 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 12:14:17.029293 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.029212 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 12:14:17.069456 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.069432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:17.069560 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.069439 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:17.069560 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:17.069530 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:17.069652 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:17.069593 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:17.222045 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.221965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" event={"ID":"3dfc4ac8bf9ace5a036e90036a3f5792","Type":"ContainerStarted","Data":"64fc0b0f86083c05902d5403f6a74b3f145501802d84058cd579b4ad6216bb63"} Apr 20 12:14:17.223718 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.223687 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" event={"ID":"a60f63c2-4c9f-486c-b55b-1436aae015dc","Type":"ContainerStarted","Data":"54ad815004805b37d6dd210a36b74ee4552426e7e18afd090418ffd169b01798"} Apr 20 12:14:17.240448 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:17.240400 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-55.ec2.internal" podStartSLOduration=22.240382996 podStartE2EDuration="22.240382996s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:14:17.239013634 +0000 UTC m=+22.791237824" watchObservedRunningTime="2026-04-20 12:14:17.240382996 +0000 UTC m=+22.792607185" Apr 20 12:14:18.066004 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:18.065975 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:18.066158 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:18.066094 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe" Apr 20 12:14:18.227817 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:18.227722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" event={"ID":"a60f63c2-4c9f-486c-b55b-1436aae015dc","Type":"ContainerStarted","Data":"201b45ccd136b6497a8ce09507076137a9a737962407893c2ae9b6ad91825a38"} Apr 20 12:14:18.230855 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:18.230825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"3f243b09887d0ca85cb847ee588a703bab93059159535a3dd2417a2b838f7a2d"} Apr 20 12:14:18.245941 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:18.245893 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2tgl2" podStartSLOduration=1.673073727 podStartE2EDuration="23.245877894s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.287785919 +0000 UTC m=+1.840010085" lastFinishedPulling="2026-04-20 12:14:17.860590072 +0000 UTC m=+23.412814252" observedRunningTime="2026-04-20 12:14:18.244358125 +0000 UTC m=+23.796582325" watchObservedRunningTime="2026-04-20 12:14:18.245877894 +0000 UTC m=+23.798102082" Apr 20 12:14:18.715084 ip-10-0-131-55 kubenswrapper[2577]: I0420 
12:14:18.714845 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:14:18.715426 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:18.715409 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:14:19.049431 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:19.049364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:19.049576 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:19.049466 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 12:14:19.049576 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:19.049521 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret podName:51f2c5d6-8d34-4caf-b764-5fd970fa149b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:27.0495031 +0000 UTC m=+32.601727273 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret") pod "global-pull-secret-syncer-rkbqg" (UID: "51f2c5d6-8d34-4caf-b764-5fd970fa149b") : object "kube-system"/"original-pull-secret" not registered Apr 20 12:14:19.069864 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:19.069837 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:19.070013 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:19.069846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:19.070013 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:19.069949 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:19.070136 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:19.070016 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:19.233045 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:19.233014 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:14:19.233466 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:19.233325 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vwq8z" Apr 20 12:14:20.066541 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:20.066475 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:20.066798 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:20.066590 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe" Apr 20 12:14:20.238300 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:20.238268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" event={"ID":"2d527830-9151-40c6-884f-3c8497f96667","Type":"ContainerStarted","Data":"4d1477742135546fa1861d5bb5c6c1b369164d4177dfe474b749f7e445b32650"} Apr 20 12:14:20.239101 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:20.238566 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:14:20.254010 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:20.253988 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:14:20.267154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:20.267115 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" podStartSLOduration=6.718602116 podStartE2EDuration="25.267104725s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.397324162 +0000 UTC m=+1.949548329" lastFinishedPulling="2026-04-20 12:14:14.945826773 +0000 UTC m=+20.498050938" observedRunningTime="2026-04-20 12:14:20.266468883 +0000 UTC m=+25.818693092" watchObservedRunningTime="2026-04-20 12:14:20.267104725 +0000 UTC m=+25.819328912" Apr 20 12:14:20.273136 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:20.273118 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:14:21.069855 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.069618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:21.069993 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:21.069895 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:21.070210 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.069630 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:21.070310 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:21.070292 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:21.241478 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.241425 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:14:21.258551 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.258523 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5" Apr 20 12:14:21.643693 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.643658 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rkbqg"] Apr 20 12:14:21.643883 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.643778 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:21.643948 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:21.643883 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:21.646227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.646201 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jnnsm"] Apr 20 12:14:21.646353 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.646303 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:21.646417 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:21.646397 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:21.646996 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.646967 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gbv8h"] Apr 20 12:14:21.647106 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:21.647074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:21.647176 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:21.647152 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe" Apr 20 12:14:23.065977 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:23.065946 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:23.065977 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:23.065966 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:23.066393 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:23.066052 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:23.066393 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:23.066217 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:23.247959 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:23.247919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerStarted","Data":"7ddde7026919682846dbe6bf6eca5ff286288c5616f91584d5447d91f74fbaa0"} Apr 20 12:14:24.066453 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:24.066390 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:24.066811 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:24.066495 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe" Apr 20 12:14:24.251980 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:24.251947 2577 generic.go:358] "Generic (PLEG): container finished" podID="272f753b-f685-4425-8290-d42ee3ab9738" containerID="7ddde7026919682846dbe6bf6eca5ff286288c5616f91584d5447d91f74fbaa0" exitCode=0 Apr 20 12:14:24.252322 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:24.252032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerDied","Data":"7ddde7026919682846dbe6bf6eca5ff286288c5616f91584d5447d91f74fbaa0"} Apr 20 12:14:25.066835 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:25.066817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:25.067158 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:25.066904 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:25.067158 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:25.066945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:25.067158 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:25.067001 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:25.255522 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:25.255448 2577 generic.go:358] "Generic (PLEG): container finished" podID="272f753b-f685-4425-8290-d42ee3ab9738" containerID="90906e3666c6b6b38758f1aa512647a50f22e92879e7c51c5f72405c78451827" exitCode=0 Apr 20 12:14:25.255522 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:25.255500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerDied","Data":"90906e3666c6b6b38758f1aa512647a50f22e92879e7c51c5f72405c78451827"} Apr 20 12:14:26.066788 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:26.066761 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:26.066918 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:26.066883 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gbv8h" podUID="93e3b405-9e2d-44f9-8fc2-b7a191baecfe" Apr 20 12:14:26.260001 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:26.259919 2577 generic.go:358] "Generic (PLEG): container finished" podID="272f753b-f685-4425-8290-d42ee3ab9738" containerID="2bc03ad9b8675cf1b873594be17c077c8d1899c68320c0d32bec799004347df0" exitCode=0 Apr 20 12:14:26.260001 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:26.259985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerDied","Data":"2bc03ad9b8675cf1b873594be17c077c8d1899c68320c0d32bec799004347df0"} Apr 20 12:14:27.066998 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.066966 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:27.067444 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.066966 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:27.067444 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.067097 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkbqg" podUID="51f2c5d6-8d34-4caf-b764-5fd970fa149b" Apr 20 12:14:27.067444 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.067170 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jnnsm" podUID="016f5832-4461-44e1-b03e-5ca0dc88515d" Apr 20 12:14:27.111547 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.111513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:27.111715 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.111633 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 12:14:27.111780 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.111715 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret podName:51f2c5d6-8d34-4caf-b764-5fd970fa149b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:43.111696815 +0000 UTC m=+48.663920993 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret") pod "global-pull-secret-syncer-rkbqg" (UID: "51f2c5d6-8d34-4caf-b764-5fd970fa149b") : object "kube-system"/"original-pull-secret" not registered Apr 20 12:14:27.717088 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.717056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:27.717256 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.717215 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:27.717308 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.717275 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.717260258 +0000 UTC m=+65.269484424 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 12:14:27.744593 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.744566 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-55.ec2.internal" event="NodeReady" Apr 20 12:14:27.744763 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.744730 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 12:14:27.780085 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.780017 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b"] Apr 20 12:14:27.783258 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.783238 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4"] Apr 20 12:14:27.783428 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.783412 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:27.785972 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.785951 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 12:14:27.786190 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.786174 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.786268 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.786231 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.786421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.786403 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw"] Apr 20 12:14:27.786554 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.786530 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:27.788971 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.788946 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 12:14:27.789181 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.789166 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pm7vr\"" Apr 20 12:14:27.789488 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.789467 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b45df776f-whmlc"] Apr 20 12:14:27.789576 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.789535 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 12:14:27.789799 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.789785 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.791842 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.791822 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 12:14:27.792105 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.792083 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 12:14:27.792193 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.792161 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 12:14:27.792434 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.792412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 12:14:27.792535 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.792515 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.794359 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.794341 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hxqfn"] Apr 20 12:14:27.795992 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.795973 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 12:14:27.795992 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.795988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 12:14:27.798772 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.798751 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 12:14:27.799225 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.799207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2vqb9\"" Apr 20 12:14:27.801971 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.801940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b"] Apr 20 12:14:27.801971 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.801974 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4"] Apr 20 12:14:27.802118 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.801992 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw"] Apr 20 12:14:27.802118 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.802006 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-5b45df776f-whmlc"] Apr 20 12:14:27.802189 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.802127 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:27.808001 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.807972 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nnqtq"] Apr 20 12:14:27.809862 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.809841 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 12:14:27.811401 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.810666 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 12:14:27.811401 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.810853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 12:14:27.812657 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.812618 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hxqfn"] Apr 20 12:14:27.812834 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.812815 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:27.815256 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.815224 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8c8ph\"" Apr 20 12:14:27.815453 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.815427 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 12:14:27.815936 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.815917 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g4vct\"" Apr 20 12:14:27.816430 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.816412 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 12:14:27.816726 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.816707 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 12:14:27.817395 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.817376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:27.817586 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.817569 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 12:14:27.818088 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.817756 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 12:14:27.818088 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.817775 2577 projected.go:194] Error preparing data for projected volume kube-api-access-t626p for pod openshift-network-diagnostics/network-check-target-gbv8h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:27.818088 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:27.817831 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p podName:93e3b405-9e2d-44f9-8fc2-b7a191baecfe nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.817813139 +0000 UTC m=+65.370037322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-t626p" (UniqueName: "kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p") pod "network-check-target-gbv8h" (UID: "93e3b405-9e2d-44f9-8fc2-b7a191baecfe") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 12:14:27.823353 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.823330 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nnqtq"] Apr 20 12:14:27.918197 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.918332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:27.918332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/66a9e150-f2ee-4399-bc62-19efc4f139d1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57ff95b97c-9xrl4\" (UID: \"66a9e150-f2ee-4399-bc62-19efc4f139d1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:27.918332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs8wt\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-kube-api-access-vs8wt\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.918332 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbvm5\" (UniqueName: \"kubernetes.io/projected/8c5c05d4-9aee-45b3-989d-dce6f05a92de-kube-api-access-pbvm5\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:27.918523 ip-10-0-131-55 kubenswrapper[2577]: I0420 
12:14:27.918350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-bound-sa-token\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.918523 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7988\" (UniqueName: \"kubernetes.io/projected/37056e79-d3b3-4b8c-954f-232d91e2a9a6-kube-api-access-l7988\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:27.918523 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-hub\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.918523 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-trusted-ca\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.918689 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvhvx\" (UniqueName: 
\"kubernetes.io/projected/66a9e150-f2ee-4399-bc62-19efc4f139d1-kube-api-access-nvhvx\") pod \"managed-serviceaccount-addon-agent-57ff95b97c-9xrl4\" (UID: \"66a9e150-f2ee-4399-bc62-19efc4f139d1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:27.918689 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8c5c05d4-9aee-45b3-989d-dce6f05a92de-klusterlet-config\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:27.918689 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918586 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:27.918689 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.918689 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918680 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsgbg\" (UniqueName: \"kubernetes.io/projected/be5daddb-3077-4d22-8e15-d75f45ef9c2a-kube-api-access-jsgbg\") pod 
\"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:27.918865 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918717 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-image-registry-private-configuration\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.918865 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c5c05d4-9aee-45b3-989d-dce6f05a92de-tmp\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:27.918865 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5daddb-3077-4d22-8e15-d75f45ef9c2a-config-volume\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:27.918865 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-ca\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.919005 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918872 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be5daddb-3077-4d22-8e15-d75f45ef9c2a-tmp-dir\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:27.919005 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c3717894-6e26-4912-a687-87e36b6785a8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.919005 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs7lm\" (UniqueName: \"kubernetes.io/projected/c3717894-6e26-4912-a687-87e36b6785a8-kube-api-access-bs7lm\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:27.919005 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.918971 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.919147 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.919014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/d6d046ee-c2af-433b-9120-a41c0d53be7b-ca-trust-extracted\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.919147 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.919035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-certificates\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:27.919147 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:27.919053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-installation-pull-secrets\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.019797 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.019762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.019942 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.019803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.019942 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.019912 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:28.020049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.019945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsgbg\" (UniqueName: \"kubernetes.io/projected/be5daddb-3077-4d22-8e15-d75f45ef9c2a-kube-api-access-jsgbg\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.020049 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.019969 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls podName:be5daddb-3077-4d22-8e15-d75f45ef9c2a nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.519950986 +0000 UTC m=+34.072175152 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls") pod "dns-default-hxqfn" (UID: "be5daddb-3077-4d22-8e15-d75f45ef9c2a") : secret "dns-default-metrics-tls" not found Apr 20 12:14:28.020049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-image-registry-private-configuration\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c5c05d4-9aee-45b3-989d-dce6f05a92de-tmp\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:28.020049 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5daddb-3077-4d22-8e15-d75f45ef9c2a-config-volume\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-ca\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 
12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be5daddb-3077-4d22-8e15-d75f45ef9c2a-tmp-dir\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020112 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c3717894-6e26-4912-a687-87e36b6785a8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs7lm\" (UniqueName: \"kubernetes.io/projected/c3717894-6e26-4912-a687-87e36b6785a8-kube-api-access-bs7lm\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/d6d046ee-c2af-433b-9120-a41c0d53be7b-ca-trust-extracted\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-certificates\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-installation-pull-secrets\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:28.020512 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/66a9e150-f2ee-4399-bc62-19efc4f139d1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57ff95b97c-9xrl4\" (UID: \"66a9e150-f2ee-4399-bc62-19efc4f139d1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs8wt\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-kube-api-access-vs8wt\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020323 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbvm5\" (UniqueName: \"kubernetes.io/projected/8c5c05d4-9aee-45b3-989d-dce6f05a92de-kube-api-access-pbvm5\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-bound-sa-token\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7988\" (UniqueName: 
\"kubernetes.io/projected/37056e79-d3b3-4b8c-954f-232d91e2a9a6-kube-api-access-l7988\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020392 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-hub\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-trusted-ca\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvhvx\" (UniqueName: \"kubernetes.io/projected/66a9e150-f2ee-4399-bc62-19efc4f139d1-kube-api-access-nvhvx\") pod \"managed-serviceaccount-addon-agent-57ff95b97c-9xrl4\" (UID: \"66a9e150-f2ee-4399-bc62-19efc4f139d1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c5c05d4-9aee-45b3-989d-dce6f05a92de-tmp\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 
20 12:14:28.020512 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.020441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8c5c05d4-9aee-45b3-989d-dce6f05a92de-klusterlet-config\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:28.021086 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.021047 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:28.021086 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.021065 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b45df776f-whmlc: secret "image-registry-tls" not found Apr 20 12:14:28.021166 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.021115 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls podName:d6d046ee-c2af-433b-9120-a41c0d53be7b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.521097224 +0000 UTC m=+34.073321389 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls") pod "image-registry-5b45df776f-whmlc" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b") : secret "image-registry-tls" not found Apr 20 12:14:28.021166 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.021150 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c3717894-6e26-4912-a687-87e36b6785a8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.022464 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.021549 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:28.022464 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.021615 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert podName:37056e79-d3b3-4b8c-954f-232d91e2a9a6 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:28.521599157 +0000 UTC m=+34.073823325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert") pod "ingress-canary-nnqtq" (UID: "37056e79-d3b3-4b8c-954f-232d91e2a9a6") : secret "canary-serving-cert" not found Apr 20 12:14:28.022464 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.021622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5daddb-3077-4d22-8e15-d75f45ef9c2a-config-volume\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.022464 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.021895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be5daddb-3077-4d22-8e15-d75f45ef9c2a-tmp-dir\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.022464 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.022299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6d046ee-c2af-433b-9120-a41c0d53be7b-ca-trust-extracted\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.022921 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.022895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-certificates\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.023480 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.023454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-trusted-ca\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.025293 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.025270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-image-registry-private-configuration\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.025402 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.025333 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.025998 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.025971 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/66a9e150-f2ee-4399-bc62-19efc4f139d1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-57ff95b97c-9xrl4\" (UID: \"66a9e150-f2ee-4399-bc62-19efc4f139d1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:28.026096 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.026012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: 
\"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.026478 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.026461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8c5c05d4-9aee-45b3-989d-dce6f05a92de-klusterlet-config\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:28.026565 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.026527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-installation-pull-secrets\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.027004 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.026979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-ca\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.027350 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.027332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c3717894-6e26-4912-a687-87e36b6785a8-hub\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.031006 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.030958 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jsgbg\" (UniqueName: \"kubernetes.io/projected/be5daddb-3077-4d22-8e15-d75f45ef9c2a-kube-api-access-jsgbg\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.033421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.033395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbvm5\" (UniqueName: \"kubernetes.io/projected/8c5c05d4-9aee-45b3-989d-dce6f05a92de-kube-api-access-pbvm5\") pod \"klusterlet-addon-workmgr-c8654684d-lh59b\" (UID: \"8c5c05d4-9aee-45b3-989d-dce6f05a92de\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:28.033866 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.033837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs8wt\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-kube-api-access-vs8wt\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.034027 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.034011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7988\" (UniqueName: \"kubernetes.io/projected/37056e79-d3b3-4b8c-954f-232d91e2a9a6-kube-api-access-l7988\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:28.034164 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.034148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs7lm\" (UniqueName: \"kubernetes.io/projected/c3717894-6e26-4912-a687-87e36b6785a8-kube-api-access-bs7lm\") pod \"cluster-proxy-proxy-agent-86f59879c9-fpxkw\" (UID: \"c3717894-6e26-4912-a687-87e36b6785a8\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.034224 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.034207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-bound-sa-token\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.034282 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.034264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvhvx\" (UniqueName: \"kubernetes.io/projected/66a9e150-f2ee-4399-bc62-19efc4f139d1-kube-api-access-nvhvx\") pod \"managed-serviceaccount-addon-agent-57ff95b97c-9xrl4\" (UID: \"66a9e150-f2ee-4399-bc62-19efc4f139d1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:28.065963 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.065943 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:14:28.068538 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.068512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 12:14:28.068946 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.068585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 12:14:28.068946 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.068594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p76zv\"" Apr 20 12:14:28.098139 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.098118 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:28.110123 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.110104 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" Apr 20 12:14:28.120861 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.120824 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" Apr 20 12:14:28.275976 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.275947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4"] Apr 20 12:14:28.279467 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.279444 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b"] Apr 20 12:14:28.279835 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:28.279744 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66a9e150_f2ee_4399_bc62_19efc4f139d1.slice/crio-696985e499219aa716abb4a6ddc822b7638ef596fdf73a5ece8a892363bfc9ac WatchSource:0}: Error finding container 696985e499219aa716abb4a6ddc822b7638ef596fdf73a5ece8a892363bfc9ac: Status 404 returned error can't find the container with id 696985e499219aa716abb4a6ddc822b7638ef596fdf73a5ece8a892363bfc9ac Apr 20 12:14:28.282598 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.282547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw"] Apr 20 12:14:28.524628 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.524590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:28.524833 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.524709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod 
\"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:28.524833 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:28.524749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:28.524833 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524763 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:28.524833 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524831 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls podName:be5daddb-3077-4d22-8e15-d75f45ef9c2a nodeName:}" failed. No retries permitted until 2026-04-20 12:14:29.524816386 +0000 UTC m=+35.077040555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls") pod "dns-default-hxqfn" (UID: "be5daddb-3077-4d22-8e15-d75f45ef9c2a") : secret "dns-default-metrics-tls" not found Apr 20 12:14:28.525047 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524863 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:28.525047 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524868 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:28.525047 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524888 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b45df776f-whmlc: secret "image-registry-tls" not found Apr 20 12:14:28.525047 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524918 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert podName:37056e79-d3b3-4b8c-954f-232d91e2a9a6 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:29.524900043 +0000 UTC m=+35.077124220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert") pod "ingress-canary-nnqtq" (UID: "37056e79-d3b3-4b8c-954f-232d91e2a9a6") : secret "canary-serving-cert" not found Apr 20 12:14:28.525047 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:28.524933 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls podName:d6d046ee-c2af-433b-9120-a41c0d53be7b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:29.524926421 +0000 UTC m=+35.077150590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls") pod "image-registry-5b45df776f-whmlc" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b") : secret "image-registry-tls" not found Apr 20 12:14:29.068172 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.068124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm" Apr 20 12:14:29.068346 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.068274 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:29.071038 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.071020 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kwdzh\"" Apr 20 12:14:29.071694 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.071107 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 12:14:29.071694 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.071369 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 12:14:29.273015 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.272972 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" event={"ID":"66a9e150-f2ee-4399-bc62-19efc4f139d1","Type":"ContainerStarted","Data":"696985e499219aa716abb4a6ddc822b7638ef596fdf73a5ece8a892363bfc9ac"} Apr 20 12:14:29.275139 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.275086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" 
event={"ID":"c3717894-6e26-4912-a687-87e36b6785a8","Type":"ContainerStarted","Data":"31c2dbcca9e4404bfa1c031e594f2b3ef868b965a4285e6662a9819be686952a"} Apr 20 12:14:29.276340 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.276300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" event={"ID":"8c5c05d4-9aee-45b3-989d-dce6f05a92de","Type":"ContainerStarted","Data":"b3480f8b3106133c7cb6a2f3320772609df7ba5444ec48baf83bfabdb0327068"} Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.532569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.532688 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:29.532730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.532978 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.533044 2577 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.533069 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b45df776f-whmlc: secret "image-registry-tls" not found Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.533049 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls podName:be5daddb-3077-4d22-8e15-d75f45ef9c2a nodeName:}" failed. No retries permitted until 2026-04-20 12:14:31.533029232 +0000 UTC m=+37.085253411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls") pod "dns-default-hxqfn" (UID: "be5daddb-3077-4d22-8e15-d75f45ef9c2a") : secret "dns-default-metrics-tls" not found Apr 20 12:14:29.533420 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.533146 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls podName:d6d046ee-c2af-433b-9120-a41c0d53be7b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:31.533123898 +0000 UTC m=+37.085348075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls") pod "image-registry-5b45df776f-whmlc" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b") : secret "image-registry-tls" not found Apr 20 12:14:29.534087 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.532881 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:29.534087 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:29.534063 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert podName:37056e79-d3b3-4b8c-954f-232d91e2a9a6 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:31.534046005 +0000 UTC m=+37.086270179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert") pod "ingress-canary-nnqtq" (UID: "37056e79-d3b3-4b8c-954f-232d91e2a9a6") : secret "canary-serving-cert" not found Apr 20 12:14:31.551094 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:31.550844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:31.551491 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:31.551092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:31.551491 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:31.551164 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:31.551491 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551283 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:31.551491 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551347 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls podName:be5daddb-3077-4d22-8e15-d75f45ef9c2a nodeName:}" failed. No retries permitted until 2026-04-20 12:14:35.551330783 +0000 UTC m=+41.103554949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls") pod "dns-default-hxqfn" (UID: "be5daddb-3077-4d22-8e15-d75f45ef9c2a") : secret "dns-default-metrics-tls" not found Apr 20 12:14:31.551789 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551768 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:31.551833 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551790 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b45df776f-whmlc: secret "image-registry-tls" not found Apr 20 12:14:31.551833 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551827 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls podName:d6d046ee-c2af-433b-9120-a41c0d53be7b nodeName:}" failed. 
No retries permitted until 2026-04-20 12:14:35.551815216 +0000 UTC m=+41.104039382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls") pod "image-registry-5b45df776f-whmlc" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b") : secret "image-registry-tls" not found Apr 20 12:14:31.551917 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551881 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:31.551917 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:31.551910 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert podName:37056e79-d3b3-4b8c-954f-232d91e2a9a6 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:35.551900139 +0000 UTC m=+41.104124308 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert") pod "ingress-canary-nnqtq" (UID: "37056e79-d3b3-4b8c-954f-232d91e2a9a6") : secret "canary-serving-cert" not found Apr 20 12:14:32.948477 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.948438 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx"] Apr 20 12:14:32.968312 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.968275 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tsf4n"] Apr 20 12:14:32.968817 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.968462 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" Apr 20 12:14:32.970783 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.970761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-nd66c\"" Apr 20 12:14:32.971924 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.971691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:32.971924 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.971691 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:32.983817 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.983794 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-964dd4574-ch7vj"] Apr 20 12:14:32.983997 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:32.983980 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" Apr 20 12:14:33.003488 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.003320 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.003488 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.003387 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 12:14:33.003945 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.003929 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.004031 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.003988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-s8fth\"" Apr 20 12:14:33.005529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.005512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 12:14:33.013711 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.011824 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx"] Apr 20 12:14:33.013711 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.011865 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tsf4n"] Apr 20 12:14:33.013711 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.011881 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-964dd4574-ch7vj"] Apr 20 12:14:33.013711 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.012064 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.016039 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.016017 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 12:14:33.029142 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 12:14:33.029142 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029136 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 12:14:33.029303 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 12:14:33.029508 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029484 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 12:14:33.029596 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029500 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9xkrp\"" Apr 20 12:14:33.029596 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029509 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.029596 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.029522 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.051176 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.051144 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"] Apr 20 12:14:33.065005 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.064966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw8xj\" (UniqueName: \"kubernetes.io/projected/320a9c81-9dfd-4ada-9ef3-fab78e5c337d-kube-api-access-nw8xj\") pod \"volume-data-source-validator-7c6cbb6c87-vb8dx\" (UID: \"320a9c81-9dfd-4ada-9ef3-fab78e5c337d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" Apr 20 12:14:33.079787 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.079768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:33.082316 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.082295 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.082436 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.082419 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 12:14:33.082580 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.082563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jz8nn\"" Apr 20 12:14:33.082918 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.082295 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.083923 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.083902 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"] Apr 20 12:14:33.109902 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.109877 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"] Apr 20 12:14:33.109902 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.109898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"] Apr 20 12:14:33.110026 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.109991 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" Apr 20 12:14:33.112619 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.112598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 12:14:33.112722 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.112684 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fbd5q\"" Apr 20 12:14:33.112722 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.112710 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.113202 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.113184 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.115154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.115136 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 12:14:33.152476 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.152443 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"] Apr 20 12:14:33.165455 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cbg52\" (UniqueName: \"kubernetes.io/projected/f7c3b32b-9b04-43fd-b10e-f895344efb6a-kube-api-access-cbg52\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" Apr 20 12:14:33.165575 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7c3b32b-9b04-43fd-b10e-f895344efb6a-trusted-ca\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" Apr 20 12:14:33.165575 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:33.165715 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c3b32b-9b04-43fd-b10e-f895344efb6a-serving-cert\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" Apr 20 12:14:33.165766 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjb47\" (UniqueName: \"kubernetes.io/projected/057fb086-1c37-429a-8d5b-48a1306a3deb-kube-api-access-cjb47\") pod 
\"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.165822 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c3b32b-9b04-43fd-b10e-f895344efb6a-config\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" Apr 20 12:14:33.165865 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw8xj\" (UniqueName: \"kubernetes.io/projected/320a9c81-9dfd-4ada-9ef3-fab78e5c337d-kube-api-access-nw8xj\") pod \"volume-data-source-validator-7c6cbb6c87-vb8dx\" (UID: \"320a9c81-9dfd-4ada-9ef3-fab78e5c337d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" Apr 20 12:14:33.165947 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.165994 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.165966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.166039 ip-10-0-131-55 kubenswrapper[2577]: I0420 
12:14:33.166006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw9s6\" (UniqueName: \"kubernetes.io/projected/f923f5f1-db46-4d28-810d-3ed65437dba9-kube-api-access-gw9s6\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:33.166088 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.166034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-stats-auth\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.166122 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.166087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-default-certificate\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.174767 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.174735 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"] Apr 20 12:14:33.174936 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.174918 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr" Apr 20 12:14:33.177832 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.177634 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-8w6wk\"" Apr 20 12:14:33.190302 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.190277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw8xj\" (UniqueName: \"kubernetes.io/projected/320a9c81-9dfd-4ada-9ef3-fab78e5c337d-kube-api-access-nw8xj\") pod \"volume-data-source-validator-7c6cbb6c87-vb8dx\" (UID: \"320a9c81-9dfd-4ada-9ef3-fab78e5c337d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" Apr 20 12:14:33.194392 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.194368 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"] Apr 20 12:14:33.194563 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.194545 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq" Apr 20 12:14:33.197210 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.197173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.197320 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.197218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-r4tts\"" Apr 20 12:14:33.197390 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.197372 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 12:14:33.197737 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.197721 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.197966 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.197873 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 12:14:33.219466 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.219363 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"] Apr 20 12:14:33.219590 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.219521 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" Apr 20 12:14:33.222391 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.222369 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qc95s\"" Apr 20 12:14:33.222622 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.222603 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 12:14:33.222733 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.222610 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 12:14:33.240469 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.240445 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hlmfm"] Apr 20 12:14:33.240650 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.240619 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p" Apr 20 12:14:33.243545 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.243528 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.243781 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.243766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.244041 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.244018 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 12:14:33.244185 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.244172 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 12:14:33.244383 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.244370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-lzlsk\"" Apr 20 12:14:33.262081 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.262056 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"] Apr 20 12:14:33.262186 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.262089 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"] Apr 20 12:14:33.262186 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.262101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"] Apr 20 12:14:33.262186 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.262114 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"] Apr 20 12:14:33.262186 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.262125 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hlmfm"] Apr 20 12:14:33.262308 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.262197 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hlmfm" Apr 20 12:14:33.264584 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.264562 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 12:14:33.264716 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.264617 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 12:14:33.264716 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.264705 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 12:14:33.265061 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.265045 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 12:14:33.265362 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.265156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-rf68r\"" Apr 20 12:14:33.266484 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6b8zg\" (UniqueName: \"kubernetes.io/projected/fa839e80-dc90-4cc7-9ee9-2520a9717383-kube-api-access-6b8zg\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" Apr 20 12:14:33.266574 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gw9s6\" (UniqueName: \"kubernetes.io/projected/f923f5f1-db46-4d28-810d-3ed65437dba9-kube-api-access-gw9s6\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:33.266574 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-stats-auth\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.266719 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-default-certificate\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:33.266719 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbg52\" (UniqueName: \"kubernetes.io/projected/f7c3b32b-9b04-43fd-b10e-f895344efb6a-kube-api-access-cbg52\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.266719 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7c3b32b-9b04-43fd-b10e-f895344efb6a-trusted-ca\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.266867 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266756 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"
Apr 20 12:14:33.266867 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn97d\" (UniqueName: \"kubernetes.io/projected/1d2db534-7694-4bb0-bf28-bfdd20993c08-kube-api-access-mn97d\") pod \"network-check-source-8894fc9bd-48fqr\" (UID: \"1d2db534-7694-4bb0-bf28-bfdd20993c08\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"
Apr 20 12:14:33.266867 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.266856 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 12:14:33.267012 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c3b32b-9b04-43fd-b10e-f895344efb6a-serving-cert\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.267012 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.266929 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls podName:f923f5f1-db46-4d28-810d-3ed65437dba9 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:33.766903835 +0000 UTC m=+39.319128004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sb889" (UID: "f923f5f1-db46-4d28-810d-3ed65437dba9") : secret "samples-operator-tls" not found
Apr 20 12:14:33.267012 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.266947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjb47\" (UniqueName: \"kubernetes.io/projected/057fb086-1c37-429a-8d5b-48a1306a3deb-kube-api-access-cjb47\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.267157 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.267013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c3b32b-9b04-43fd-b10e-f895344efb6a-config\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.267157 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.267063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.267157 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.267096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.267157 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.267127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.267343 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.267165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fa839e80-dc90-4cc7-9ee9-2520a9717383-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.267819 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.267799 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:33.767783715 +0000 UTC m=+39.320007894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : configmap references non-existent config key: service-ca.crt
Apr 20 12:14:33.268315 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.268170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c3b32b-9b04-43fd-b10e-f895344efb6a-config\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.268315 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.268271 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 12:14:33.268668 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.268595 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:33.768578112 +0000 UTC m=+39.320802284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : secret "router-metrics-certs-default" not found
Apr 20 12:14:33.270633 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.270400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c3b32b-9b04-43fd-b10e-f895344efb6a-serving-cert\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.270734 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.270662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-default-certificate\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.270784 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.270742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-stats-auth\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.273175 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.273153 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 12:14:33.276973 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.276953 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjb47\" (UniqueName: \"kubernetes.io/projected/057fb086-1c37-429a-8d5b-48a1306a3deb-kube-api-access-cjb47\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.277068 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.277014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbg52\" (UniqueName: \"kubernetes.io/projected/f7c3b32b-9b04-43fd-b10e-f895344efb6a-kube-api-access-cbg52\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.277822 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.277796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw9s6\" (UniqueName: \"kubernetes.io/projected/f923f5f1-db46-4d28-810d-3ed65437dba9-kube-api-access-gw9s6\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"
Apr 20 12:14:33.280068 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.280049 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx"
Apr 20 12:14:33.280529 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.280508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f7c3b32b-9b04-43fd-b10e-f895344efb6a-trusted-ca\") pod \"console-operator-9d4b6777b-tsf4n\" (UID: \"f7c3b32b-9b04-43fd-b10e-f895344efb6a\") " pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.293959 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.293940 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:33.368521 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-service-ca-bundle\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.368690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b845784c-6d33-44f1-8015-6ea907093662-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.368690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b845784c-6d33-44f1-8015-6ea907093662-config\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.368690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9d28bf-8c80-47b4-8d39-a66df9464d5b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.368831 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn97d\" (UniqueName: \"kubernetes.io/projected/1d2db534-7694-4bb0-bf28-bfdd20993c08-kube-api-access-mn97d\") pod \"network-check-source-8894fc9bd-48fqr\" (UID: \"1d2db534-7694-4bb0-bf28-bfdd20993c08\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"
Apr 20 12:14:33.368831 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-tmp\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.368831 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df9d28bf-8c80-47b4-8d39-a66df9464d5b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.368831 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q46cb\" (UniqueName: \"kubernetes.io/projected/df9d28bf-8c80-47b4-8d39-a66df9464d5b-kube-api-access-q46cb\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.369016 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c3302a13-45f4-425e-b2d2-ff221a9e7b91-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:14:33.369016 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-serving-cert\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.369016 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.369016 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:14:33.369016 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.368981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-snapshots\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.369245 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.369030 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 12:14:33.369245 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.369100 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls podName:fa839e80-dc90-4cc7-9ee9-2520a9717383 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:33.869079433 +0000 UTC m=+39.421303602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cg6bv" (UID: "fa839e80-dc90-4cc7-9ee9-2520a9717383") : secret "cluster-monitoring-operator-tls" not found
Apr 20 12:14:33.369245 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.369032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.369245 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.369169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q726\" (UniqueName: \"kubernetes.io/projected/b845784c-6d33-44f1-8015-6ea907093662-kube-api-access-7q726\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.369245 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.369206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fa839e80-dc90-4cc7-9ee9-2520a9717383-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.369487 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.369244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8zg\" (UniqueName: \"kubernetes.io/projected/fa839e80-dc90-4cc7-9ee9-2520a9717383-kube-api-access-6b8zg\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.369487 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.369290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtmt\" (UniqueName: \"kubernetes.io/projected/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-kube-api-access-zhtmt\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.369893 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.369875 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fa839e80-dc90-4cc7-9ee9-2520a9717383-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.382304 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.382280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn97d\" (UniqueName: \"kubernetes.io/projected/1d2db534-7694-4bb0-bf28-bfdd20993c08-kube-api-access-mn97d\") pod \"network-check-source-8894fc9bd-48fqr\" (UID: \"1d2db534-7694-4bb0-bf28-bfdd20993c08\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"
Apr 20 12:14:33.385345 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.385321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8zg\" (UniqueName: \"kubernetes.io/projected/fa839e80-dc90-4cc7-9ee9-2520a9717383-kube-api-access-6b8zg\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.469940 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.469850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q726\" (UniqueName: \"kubernetes.io/projected/b845784c-6d33-44f1-8015-6ea907093662-kube-api-access-7q726\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.469940 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.469912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtmt\" (UniqueName: \"kubernetes.io/projected/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-kube-api-access-zhtmt\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.470133 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.469949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-service-ca-bundle\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.470133 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.469987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b845784c-6d33-44f1-8015-6ea907093662-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.470133 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b845784c-6d33-44f1-8015-6ea907093662-config\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.470133 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9d28bf-8c80-47b4-8d39-a66df9464d5b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.470133 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-tmp\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.470335 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df9d28bf-8c80-47b4-8d39-a66df9464d5b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.470335 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q46cb\" (UniqueName: \"kubernetes.io/projected/df9d28bf-8c80-47b4-8d39-a66df9464d5b-kube-api-access-q46cb\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.470335 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c3302a13-45f4-425e-b2d2-ff221a9e7b91-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:14:33.470712 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b845784c-6d33-44f1-8015-6ea907093662-config\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.470803 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-serving-cert\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.470856 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:14:33.470911 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-snapshots\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.470911 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.470879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.471072 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.471049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df9d28bf-8c80-47b4-8d39-a66df9464d5b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.471369 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.471337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-tmp\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.471735 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.471711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.471858 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.471835 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 12:14:33.471979 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.471902 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert podName:c3302a13-45f4-425e-b2d2-ff221a9e7b91 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:33.971883399 +0000 UTC m=+39.524107569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t44j5" (UID: "c3302a13-45f4-425e-b2d2-ff221a9e7b91") : secret "networking-console-plugin-cert" not found
Apr 20 12:14:33.472601 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.472546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-snapshots\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.472929 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.472859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-service-ca-bundle\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.474227 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.474208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df9d28bf-8c80-47b4-8d39-a66df9464d5b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.474342 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.474315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b845784c-6d33-44f1-8015-6ea907093662-serving-cert\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.474436 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.474371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-serving-cert\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.478159 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.478132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q726\" (UniqueName: \"kubernetes.io/projected/b845784c-6d33-44f1-8015-6ea907093662-kube-api-access-7q726\") pod \"service-ca-operator-d6fc45fc5-x6nvq\" (UID: \"b845784c-6d33-44f1-8015-6ea907093662\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.480146 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.480124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtmt\" (UniqueName: \"kubernetes.io/projected/390328d7-a7ce-4e5b-bb2b-853e4f3b21d7-kube-api-access-zhtmt\") pod \"insights-operator-585dfdc468-hlmfm\" (UID: \"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7\") " pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.480232 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.480151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q46cb\" (UniqueName: \"kubernetes.io/projected/df9d28bf-8c80-47b4-8d39-a66df9464d5b-kube-api-access-q46cb\") pod \"kube-storage-version-migrator-operator-6769c5d45-gnj2p\" (UID: \"df9d28bf-8c80-47b4-8d39-a66df9464d5b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.482228 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.482208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c3302a13-45f4-425e-b2d2-ff221a9e7b91-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:14:33.486131 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.486111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"
Apr 20 12:14:33.505085 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.505064 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"
Apr 20 12:14:33.551395 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.551369 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"
Apr 20 12:14:33.590402 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.590372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-hlmfm"
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.773374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.773436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.773535 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:34.773513165 +0000 UTC m=+40.325737349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : configmap references non-existent config key: service-ca.crt
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.773584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.773597 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.773672 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:34.773655032 +0000 UTC m=+40.325879216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : secret "router-metrics-certs-default" not found
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.773704 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 12:14:33.774077 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.773741 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls podName:f923f5f1-db46-4d28-810d-3ed65437dba9 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:34.773727309 +0000 UTC m=+40.325951475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sb889" (UID: "f923f5f1-db46-4d28-810d-3ed65437dba9") : secret "samples-operator-tls" not found
Apr 20 12:14:33.874129 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.874094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:33.874278 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.874234 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 12:14:33.874322 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.874292 2577
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls podName:fa839e80-dc90-4cc7-9ee9-2520a9717383 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:34.874279187 +0000 UTC m=+40.426503352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cg6bv" (UID: "fa839e80-dc90-4cc7-9ee9-2520a9717383") : secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:33.974751 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:33.974714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" Apr 20 12:14:33.975236 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.974868 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 12:14:33.975236 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:33.974946 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert podName:c3302a13-45f4-425e-b2d2-ff221a9e7b91 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:34.974924466 +0000 UTC m=+40.527148633 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t44j5" (UID: "c3302a13-45f4-425e-b2d2-ff221a9e7b91") : secret "networking-console-plugin-cert" not found Apr 20 12:14:34.781982 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:34.781943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:34.782154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:34.782005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:34.782154 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:34.782093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:34.782265 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.782163 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. 
No retries permitted until 2026-04-20 12:14:36.782138647 +0000 UTC m=+42.334362830 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : configmap references non-existent config key: service-ca.crt Apr 20 12:14:34.782265 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.782184 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 12:14:34.782265 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.782199 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 12:14:34.782265 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.782243 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls podName:f923f5f1-db46-4d28-810d-3ed65437dba9 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:36.782227284 +0000 UTC m=+42.334451450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sb889" (UID: "f923f5f1-db46-4d28-810d-3ed65437dba9") : secret "samples-operator-tls" not found Apr 20 12:14:34.782265 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.782263 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:36.782251301 +0000 UTC m=+42.334475471 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : secret "router-metrics-certs-default" not found Apr 20 12:14:34.882717 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:34.882682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" Apr 20 12:14:34.882897 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.882843 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:34.882964 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.882920 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls podName:fa839e80-dc90-4cc7-9ee9-2520a9717383 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:36.882898407 +0000 UTC m=+42.435122582 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cg6bv" (UID: "fa839e80-dc90-4cc7-9ee9-2520a9717383") : secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:34.983545 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:34.983514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" Apr 20 12:14:34.983929 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.983718 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 12:14:34.983985 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:34.983791 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert podName:c3302a13-45f4-425e-b2d2-ff221a9e7b91 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:36.98377116 +0000 UTC m=+42.535995330 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t44j5" (UID: "c3302a13-45f4-425e-b2d2-ff221a9e7b91") : secret "networking-console-plugin-cert" not found Apr 20 12:14:35.588259 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:35.588217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:35.588445 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:35.588291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:35.588445 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:35.588345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:35.588445 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:35.588379 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:35.588445 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:35.588434 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:35.588670 ip-10-0-131-55 kubenswrapper[2577]: E0420 
12:14:35.588461 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:35.588670 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:35.588487 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b45df776f-whmlc: secret "image-registry-tls" not found Apr 20 12:14:35.588670 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:35.588471 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert podName:37056e79-d3b3-4b8c-954f-232d91e2a9a6 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:43.58845406 +0000 UTC m=+49.140678225 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert") pod "ingress-canary-nnqtq" (UID: "37056e79-d3b3-4b8c-954f-232d91e2a9a6") : secret "canary-serving-cert" not found Apr 20 12:14:35.588670 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:35.588616 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls podName:be5daddb-3077-4d22-8e15-d75f45ef9c2a nodeName:}" failed. No retries permitted until 2026-04-20 12:14:43.588601115 +0000 UTC m=+49.140825281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls") pod "dns-default-hxqfn" (UID: "be5daddb-3077-4d22-8e15-d75f45ef9c2a") : secret "dns-default-metrics-tls" not found Apr 20 12:14:35.588670 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:35.588658 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls podName:d6d046ee-c2af-433b-9120-a41c0d53be7b nodeName:}" failed. 
No retries permitted until 2026-04-20 12:14:43.588628535 +0000 UTC m=+49.140852707 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls") pod "image-registry-5b45df776f-whmlc" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b") : secret "image-registry-tls" not found Apr 20 12:14:36.799153 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:36.799119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:36.799235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:36.799279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.799291 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.799358 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls podName:f923f5f1-db46-4d28-810d-3ed65437dba9 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:40.79933837 +0000 UTC m=+46.351562537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sb889" (UID: "f923f5f1-db46-4d28-810d-3ed65437dba9") : secret "samples-operator-tls" not found Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.799381 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.799407 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:40.799387546 +0000 UTC m=+46.351611731 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : configmap references non-existent config key: service-ca.crt Apr 20 12:14:36.799585 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.799430 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:40.799423319 +0000 UTC m=+46.351647488 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : secret "router-metrics-certs-default" not found Apr 20 12:14:36.900149 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:36.900125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" Apr 20 12:14:36.900280 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.900263 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:36.900334 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:36.900325 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls podName:fa839e80-dc90-4cc7-9ee9-2520a9717383 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:40.900310385 +0000 UTC m=+46.452534551 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cg6bv" (UID: "fa839e80-dc90-4cc7-9ee9-2520a9717383") : secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:37.001540 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.001510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" Apr 20 12:14:37.001704 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:37.001681 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 12:14:37.001774 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:37.001765 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert podName:c3302a13-45f4-425e-b2d2-ff221a9e7b91 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:41.001748371 +0000 UTC m=+46.553972537 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t44j5" (UID: "c3302a13-45f4-425e-b2d2-ff221a9e7b91") : secret "networking-console-plugin-cert" not found Apr 20 12:14:37.462995 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.462946 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr"] Apr 20 12:14:37.477715 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.476715 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx"] Apr 20 12:14:37.487792 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:37.487541 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2db534_7694_4bb0_bf28_bfdd20993c08.slice/crio-e9c60756a71a2f0ffd60174c0c13fa69276af2a2190cee2cbae7b9ea3e176ad1 WatchSource:0}: Error finding container e9c60756a71a2f0ffd60174c0c13fa69276af2a2190cee2cbae7b9ea3e176ad1: Status 404 returned error can't find the container with id e9c60756a71a2f0ffd60174c0c13fa69276af2a2190cee2cbae7b9ea3e176ad1 Apr 20 12:14:37.557224 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.557173 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-hlmfm"] Apr 20 12:14:37.717480 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.717455 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p"] Apr 20 12:14:37.720381 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.720315 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq"] Apr 20 12:14:37.722863 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:37.722837 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tsf4n"] Apr 20 12:14:37.733691 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:37.733598 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf9d28bf_8c80_47b4_8d39_a66df9464d5b.slice/crio-0c795bc35921a59f883d1b720b391580dc8bd545ec7be6016cea1f7ed06533e1 WatchSource:0}: Error finding container 0c795bc35921a59f883d1b720b391580dc8bd545ec7be6016cea1f7ed06533e1: Status 404 returned error can't find the container with id 0c795bc35921a59f883d1b720b391580dc8bd545ec7be6016cea1f7ed06533e1 Apr 20 12:14:37.734284 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:37.734168 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb845784c_6d33_44f1_8015_6ea907093662.slice/crio-21922c810a8802fa2e90c1cd9c2ac6a61b0632a8fd2abcbc4159136108621803 WatchSource:0}: Error finding container 21922c810a8802fa2e90c1cd9c2ac6a61b0632a8fd2abcbc4159136108621803: Status 404 returned error can't find the container with id 21922c810a8802fa2e90c1cd9c2ac6a61b0632a8fd2abcbc4159136108621803 Apr 20 12:14:37.734360 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:37.734340 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390328d7_a7ce_4e5b_bb2b_853e4f3b21d7.slice/crio-d0e77f384afd7c0b2cd5fa9396702b16bc04a9ede306a308c3a9ecfde355e37a WatchSource:0}: Error finding container d0e77f384afd7c0b2cd5fa9396702b16bc04a9ede306a308c3a9ecfde355e37a: Status 404 returned error can't find the container with id d0e77f384afd7c0b2cd5fa9396702b16bc04a9ede306a308c3a9ecfde355e37a Apr 20 12:14:37.735323 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:37.735293 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c3b32b_9b04_43fd_b10e_f895344efb6a.slice/crio-6cae565d6dcd56122d43cb55a4fe41a9cdd977cf72df8a7d165492843fbaadcb WatchSource:0}: Error finding container 6cae565d6dcd56122d43cb55a4fe41a9cdd977cf72df8a7d165492843fbaadcb: Status 404 returned error can't find the container with id 6cae565d6dcd56122d43cb55a4fe41a9cdd977cf72df8a7d165492843fbaadcb Apr 20 12:14:38.303499 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.303452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" event={"ID":"c3717894-6e26-4912-a687-87e36b6785a8","Type":"ContainerStarted","Data":"7c9b8085f8db2d3fac9c1fefc261dcdf45337dead63e4154794ca6f26b63d3d8"} Apr 20 12:14:38.304888 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.304852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" event={"ID":"f7c3b32b-9b04-43fd-b10e-f895344efb6a","Type":"ContainerStarted","Data":"6cae565d6dcd56122d43cb55a4fe41a9cdd977cf72df8a7d165492843fbaadcb"} Apr 20 12:14:38.306600 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.306568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq" event={"ID":"b845784c-6d33-44f1-8015-6ea907093662","Type":"ContainerStarted","Data":"21922c810a8802fa2e90c1cd9c2ac6a61b0632a8fd2abcbc4159136108621803"} Apr 20 12:14:38.308442 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.308418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" event={"ID":"66a9e150-f2ee-4399-bc62-19efc4f139d1","Type":"ContainerStarted","Data":"ddec1df9ff46033966e505878bfcfb1bbd50c8270725749b2858c70ab34d2dda"} Apr 20 12:14:38.310213 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.310188 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p" event={"ID":"df9d28bf-8c80-47b4-8d39-a66df9464d5b","Type":"ContainerStarted","Data":"0c795bc35921a59f883d1b720b391580dc8bd545ec7be6016cea1f7ed06533e1"} Apr 20 12:14:38.311668 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.311625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hlmfm" event={"ID":"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7","Type":"ContainerStarted","Data":"d0e77f384afd7c0b2cd5fa9396702b16bc04a9ede306a308c3a9ecfde355e37a"} Apr 20 12:14:38.313661 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.312851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr" event={"ID":"1d2db534-7694-4bb0-bf28-bfdd20993c08","Type":"ContainerStarted","Data":"e9c60756a71a2f0ffd60174c0c13fa69276af2a2190cee2cbae7b9ea3e176ad1"} Apr 20 12:14:38.317878 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.317705 2577 generic.go:358] "Generic (PLEG): container finished" podID="272f753b-f685-4425-8290-d42ee3ab9738" containerID="e7ed45dafca122a9978e0e516600f21c5d401f023951fd114e27ba50127af5ac" exitCode=0 Apr 20 12:14:38.317878 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.317778 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerDied","Data":"e7ed45dafca122a9978e0e516600f21c5d401f023951fd114e27ba50127af5ac"} Apr 20 12:14:38.320605 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.320209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" event={"ID":"8c5c05d4-9aee-45b3-989d-dce6f05a92de","Type":"ContainerStarted","Data":"2ea739e4d79708b62c7179fe325e1ca63caa1ba792be7aafaf476340ea3559da"} Apr 20 
12:14:38.320605 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.320423 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:38.321490 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.321465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" event={"ID":"320a9c81-9dfd-4ada-9ef3-fab78e5c337d","Type":"ContainerStarted","Data":"9382409e6880e3381f30c98351d28220167f0c3594a0bba1bff975f78f802947"} Apr 20 12:14:38.322716 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.322663 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" Apr 20 12:14:38.326356 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.325723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-57ff95b97c-9xrl4" podStartSLOduration=10.369730335 podStartE2EDuration="19.325708501s" podCreationTimestamp="2026-04-20 12:14:19 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.285757617 +0000 UTC m=+33.837981784" lastFinishedPulling="2026-04-20 12:14:37.24173577 +0000 UTC m=+42.793959950" observedRunningTime="2026-04-20 12:14:38.32515617 +0000 UTC m=+43.877380383" watchObservedRunningTime="2026-04-20 12:14:38.325708501 +0000 UTC m=+43.877932685" Apr 20 12:14:38.341686 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:38.341631 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c8654684d-lh59b" podStartSLOduration=10.392255611 podStartE2EDuration="19.341622489s" podCreationTimestamp="2026-04-20 12:14:19 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.284153598 +0000 UTC m=+33.836377764" 
lastFinishedPulling="2026-04-20 12:14:37.233520463 +0000 UTC m=+42.785744642" observedRunningTime="2026-04-20 12:14:38.34147345 +0000 UTC m=+43.893697638" watchObservedRunningTime="2026-04-20 12:14:38.341622489 +0000 UTC m=+43.893846676" Apr 20 12:14:39.330842 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:39.330468 2577 generic.go:358] "Generic (PLEG): container finished" podID="272f753b-f685-4425-8290-d42ee3ab9738" containerID="7ae6ea65fed52b5571ccf3d629ca0d93724790dfd3ce049fa3fd156aca1a8195" exitCode=0 Apr 20 12:14:39.330842 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:39.330593 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerDied","Data":"7ae6ea65fed52b5571ccf3d629ca0d93724790dfd3ce049fa3fd156aca1a8195"} Apr 20 12:14:40.337224 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:40.337141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4m89" event={"ID":"272f753b-f685-4425-8290-d42ee3ab9738","Type":"ContainerStarted","Data":"650c60ba40521cd14e939f460f9be6479bff8ecf8ca3966cc2a519a98dd2b1ee"} Apr 20 12:14:40.367090 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:40.367023 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d4m89" podStartSLOduration=4.529462619 podStartE2EDuration="45.367010892s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:13:56.36429526 +0000 UTC m=+1.916519426" lastFinishedPulling="2026-04-20 12:14:37.20184352 +0000 UTC m=+42.754067699" observedRunningTime="2026-04-20 12:14:40.365170172 +0000 UTC m=+45.917394398" watchObservedRunningTime="2026-04-20 12:14:40.367010892 +0000 UTC m=+45.919235080" Apr 20 12:14:40.844527 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:40.844482 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:40.844722 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:40.844544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:40.844722 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:40.844662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:40.844843 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.844731 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 12:14:40.844843 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.844764 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:48.844745309 +0000 UTC m=+54.396969479 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : configmap references non-existent config key: service-ca.crt Apr 20 12:14:40.844843 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.844778 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 12:14:40.844843 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.844795 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:14:48.84477636 +0000 UTC m=+54.397000533 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : secret "router-metrics-certs-default" not found Apr 20 12:14:40.844843 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.844828 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls podName:f923f5f1-db46-4d28-810d-3ed65437dba9 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:48.844815848 +0000 UTC m=+54.397040014 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sb889" (UID: "f923f5f1-db46-4d28-810d-3ed65437dba9") : secret "samples-operator-tls" not found Apr 20 12:14:40.945358 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:40.945275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" Apr 20 12:14:40.945513 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.945410 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:40.945513 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:40.945478 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls podName:fa839e80-dc90-4cc7-9ee9-2520a9717383 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:48.945462717 +0000 UTC m=+54.497686882 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cg6bv" (UID: "fa839e80-dc90-4cc7-9ee9-2520a9717383") : secret "cluster-monitoring-operator-tls" not found Apr 20 12:14:41.046273 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:41.046244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" Apr 20 12:14:41.046442 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:41.046392 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 12:14:41.046513 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:41.046458 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert podName:c3302a13-45f4-425e-b2d2-ff221a9e7b91 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:49.046441567 +0000 UTC m=+54.598665732 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t44j5" (UID: "c3302a13-45f4-425e-b2d2-ff221a9e7b91") : secret "networking-console-plugin-cert" not found Apr 20 12:14:43.165962 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:43.165922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:43.172029 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:43.171993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/51f2c5d6-8d34-4caf-b764-5fd970fa149b-original-pull-secret\") pod \"global-pull-secret-syncer-rkbqg\" (UID: \"51f2c5d6-8d34-4caf-b764-5fd970fa149b\") " pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:43.191837 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:43.191806 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkbqg" Apr 20 12:14:43.670341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:43.670297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn" Apr 20 12:14:43.670532 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:43.670358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:14:43.670532 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:43.670412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq" Apr 20 12:14:43.670532 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670467 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 12:14:43.670733 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670548 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls podName:be5daddb-3077-4d22-8e15-d75f45ef9c2a nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.670527086 +0000 UTC m=+65.222751253 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls") pod "dns-default-hxqfn" (UID: "be5daddb-3077-4d22-8e15-d75f45ef9c2a") : secret "dns-default-metrics-tls" not found Apr 20 12:14:43.670733 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670553 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 12:14:43.670733 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670556 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 12:14:43.670733 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670576 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5b45df776f-whmlc: secret "image-registry-tls" not found Apr 20 12:14:43.670733 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670614 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert podName:37056e79-d3b3-4b8c-954f-232d91e2a9a6 nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.670599097 +0000 UTC m=+65.222823264 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert") pod "ingress-canary-nnqtq" (UID: "37056e79-d3b3-4b8c-954f-232d91e2a9a6") : secret "canary-serving-cert" not found Apr 20 12:14:43.670733 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:43.670630 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls podName:d6d046ee-c2af-433b-9120-a41c0d53be7b nodeName:}" failed. No retries permitted until 2026-04-20 12:14:59.670622413 +0000 UTC m=+65.222846578 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls") pod "image-registry-5b45df776f-whmlc" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b") : secret "image-registry-tls" not found Apr 20 12:14:45.011909 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.011868 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rkbqg"] Apr 20 12:14:45.020495 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:14:45.020461 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f2c5d6_8d34_4caf_b764_5fd970fa149b.slice/crio-aae168e1fb9dd94e709ec950c24629715951f86e91db4c68b4e51ee7e81b07d8 WatchSource:0}: Error finding container aae168e1fb9dd94e709ec950c24629715951f86e91db4c68b4e51ee7e81b07d8: Status 404 returned error can't find the container with id aae168e1fb9dd94e709ec950c24629715951f86e91db4c68b4e51ee7e81b07d8 Apr 20 12:14:45.350304 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.350276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" event={"ID":"320a9c81-9dfd-4ada-9ef3-fab78e5c337d","Type":"ContainerStarted","Data":"8e8284230573a6b115492884a81aa1620c675b58d1e157005b8b83402df2c364"} Apr 20 12:14:45.351978 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.351953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" event={"ID":"c3717894-6e26-4912-a687-87e36b6785a8","Type":"ContainerStarted","Data":"c4fdd7ae9029089698d813e7be243db5835db4a178c4b8212b2ff0894bf000f3"} Apr 20 12:14:45.353262 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.353244 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/0.log" Apr 20 12:14:45.353338 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.353284 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7c3b32b-9b04-43fd-b10e-f895344efb6a" containerID="5fce0044617b4e18d113c8aec3a7a252309145791ea7232376851316b8061dda" exitCode=255 Apr 20 12:14:45.353405 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.353351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" event={"ID":"f7c3b32b-9b04-43fd-b10e-f895344efb6a","Type":"ContainerDied","Data":"5fce0044617b4e18d113c8aec3a7a252309145791ea7232376851316b8061dda"} Apr 20 12:14:45.353518 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.353505 2577 scope.go:117] "RemoveContainer" containerID="5fce0044617b4e18d113c8aec3a7a252309145791ea7232376851316b8061dda" Apr 20 12:14:45.357038 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.357015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq" event={"ID":"b845784c-6d33-44f1-8015-6ea907093662","Type":"ContainerStarted","Data":"d2d1c4d1667f9a6d7410deef21bc4f68e71d3b1ca3e1a0e1d9b9fddddc5c009d"} Apr 20 12:14:45.358983 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.358540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p" event={"ID":"df9d28bf-8c80-47b4-8d39-a66df9464d5b","Type":"ContainerStarted","Data":"7d5e254c8785e0bb845082376141ad40ce9187b2ca6a6bfa2593d6cac361ca8c"} Apr 20 12:14:45.360132 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.360114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hlmfm" 
event={"ID":"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7","Type":"ContainerStarted","Data":"96298c2ee1784a3326952759e70d20539ac45b3fb2d85d3b55095ebe538feb15"} Apr 20 12:14:45.361848 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.361827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr" event={"ID":"1d2db534-7694-4bb0-bf28-bfdd20993c08","Type":"ContainerStarted","Data":"06eb86909dfca131917c9f3e38d82e7bccd932f877883ec83aa6d054d5bc02f4"} Apr 20 12:14:45.363389 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.363368 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rkbqg" event={"ID":"51f2c5d6-8d34-4caf-b764-5fd970fa149b","Type":"ContainerStarted","Data":"aae168e1fb9dd94e709ec950c24629715951f86e91db4c68b4e51ee7e81b07d8"} Apr 20 12:14:45.367101 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.367055 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vb8dx" podStartSLOduration=6.027030882 podStartE2EDuration="13.367042158s" podCreationTimestamp="2026-04-20 12:14:32 +0000 UTC" firstStartedPulling="2026-04-20 12:14:37.492741475 +0000 UTC m=+43.044965650" lastFinishedPulling="2026-04-20 12:14:44.832752744 +0000 UTC m=+50.384976926" observedRunningTime="2026-04-20 12:14:45.366766959 +0000 UTC m=+50.918991148" watchObservedRunningTime="2026-04-20 12:14:45.367042158 +0000 UTC m=+50.919266347" Apr 20 12:14:45.384363 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.384318 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-48fqr" podStartSLOduration=4.982404043 podStartE2EDuration="12.384304037s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:14:37.491187038 +0000 UTC m=+43.043411218" lastFinishedPulling="2026-04-20 
12:14:44.893087028 +0000 UTC m=+50.445311212" observedRunningTime="2026-04-20 12:14:45.383608771 +0000 UTC m=+50.935832956" watchObservedRunningTime="2026-04-20 12:14:45.384304037 +0000 UTC m=+50.936528226" Apr 20 12:14:45.418760 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.418713 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq" podStartSLOduration=5.270796121 podStartE2EDuration="12.418698625s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:14:37.736449231 +0000 UTC m=+43.288673401" lastFinishedPulling="2026-04-20 12:14:44.884351736 +0000 UTC m=+50.436575905" observedRunningTime="2026-04-20 12:14:45.417433083 +0000 UTC m=+50.969657273" watchObservedRunningTime="2026-04-20 12:14:45.418698625 +0000 UTC m=+50.970922815" Apr 20 12:14:45.440159 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.440106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p" podStartSLOduration=5.292121431 podStartE2EDuration="12.440089017s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:14:37.73638532 +0000 UTC m=+43.288609490" lastFinishedPulling="2026-04-20 12:14:44.884352896 +0000 UTC m=+50.436577076" observedRunningTime="2026-04-20 12:14:45.438308492 +0000 UTC m=+50.990532692" watchObservedRunningTime="2026-04-20 12:14:45.440089017 +0000 UTC m=+50.992313207" Apr 20 12:14:45.460463 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:45.460409 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-hlmfm" podStartSLOduration=5.313538555 podStartE2EDuration="12.460391572s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:14:37.736428648 +0000 UTC m=+43.288652820" 
lastFinishedPulling="2026-04-20 12:14:44.883281668 +0000 UTC m=+50.435505837" observedRunningTime="2026-04-20 12:14:45.458947636 +0000 UTC m=+51.011171825" watchObservedRunningTime="2026-04-20 12:14:45.460391572 +0000 UTC m=+51.012615759" Apr 20 12:14:46.369774 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.369738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" event={"ID":"c3717894-6e26-4912-a687-87e36b6785a8","Type":"ContainerStarted","Data":"0b8375c1431e511163eeb7a0854734d4721d818858e16b23714f7dd9f2d4f1e0"} Apr 20 12:14:46.371367 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.371339 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log" Apr 20 12:14:46.371814 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.371790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/0.log" Apr 20 12:14:46.371888 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.371835 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7c3b32b-9b04-43fd-b10e-f895344efb6a" containerID="f17ca09842261ddfe1a88865d95ef5b525cfaa97509dd3c2d51762b04a285147" exitCode=255 Apr 20 12:14:46.371992 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.371957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" event={"ID":"f7c3b32b-9b04-43fd-b10e-f895344efb6a","Type":"ContainerDied","Data":"f17ca09842261ddfe1a88865d95ef5b525cfaa97509dd3c2d51762b04a285147"} Apr 20 12:14:46.372064 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.372014 2577 scope.go:117] "RemoveContainer" containerID="5fce0044617b4e18d113c8aec3a7a252309145791ea7232376851316b8061dda" Apr 20 12:14:46.372178 
ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.372162 2577 scope.go:117] "RemoveContainer" containerID="f17ca09842261ddfe1a88865d95ef5b525cfaa97509dd3c2d51762b04a285147" Apr 20 12:14:46.372354 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:46.372335 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tsf4n_openshift-console-operator(f7c3b32b-9b04-43fd-b10e-f895344efb6a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" podUID="f7c3b32b-9b04-43fd-b10e-f895344efb6a" Apr 20 12:14:46.388933 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:46.388893 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" podStartSLOduration=10.484498075 podStartE2EDuration="27.388881694s" podCreationTimestamp="2026-04-20 12:14:19 +0000 UTC" firstStartedPulling="2026-04-20 12:14:28.288317975 +0000 UTC m=+33.840542145" lastFinishedPulling="2026-04-20 12:14:45.192701585 +0000 UTC m=+50.744925764" observedRunningTime="2026-04-20 12:14:46.388200464 +0000 UTC m=+51.940424655" watchObservedRunningTime="2026-04-20 12:14:46.388881694 +0000 UTC m=+51.941105883" Apr 20 12:14:47.376141 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:47.376111 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log" Apr 20 12:14:47.376633 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:47.376561 2577 scope.go:117] "RemoveContainer" containerID="f17ca09842261ddfe1a88865d95ef5b525cfaa97509dd3c2d51762b04a285147" Apr 20 12:14:47.376751 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:47.376734 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tsf4n_openshift-console-operator(f7c3b32b-9b04-43fd-b10e-f895344efb6a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" podUID="f7c3b32b-9b04-43fd-b10e-f895344efb6a" Apr 20 12:14:48.921478 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:48.921445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:48.921507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj" Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:48.921618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:48.921627 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:48.921632 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:15:04.921612086 +0000 UTC m=+70.473836270 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : configmap references non-existent config key: service-ca.crt Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:48.921715 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs podName:057fb086-1c37-429a-8d5b-48a1306a3deb nodeName:}" failed. No retries permitted until 2026-04-20 12:15:04.921702885 +0000 UTC m=+70.473927052 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs") pod "router-default-964dd4574-ch7vj" (UID: "057fb086-1c37-429a-8d5b-48a1306a3deb") : secret "router-metrics-certs-default" not found Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:48.921721 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 12:14:48.921977 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:48.921780 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls podName:f923f5f1-db46-4d28-810d-3ed65437dba9 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:04.921763349 +0000 UTC m=+70.473987517 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-sb889" (UID: "f923f5f1-db46-4d28-810d-3ed65437dba9") : secret "samples-operator-tls" not found
Apr 20 12:14:48.942320 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:48.942289 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bj5lp_24b836ac-13ec-49aa-be4b-4250c8e79676/dns-node-resolver/0.log"
Apr 20 12:14:49.023368 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:49.023328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:14:49.023550 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:49.023469 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 12:14:49.023550 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:49.023535 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls podName:fa839e80-dc90-4cc7-9ee9-2520a9717383 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:05.023519606 +0000 UTC m=+70.575743771 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-cg6bv" (UID: "fa839e80-dc90-4cc7-9ee9-2520a9717383") : secret "cluster-monitoring-operator-tls" not found
Apr 20 12:14:49.124676 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:49.124626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:14:49.124848 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:49.124795 2577 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 12:14:49.124908 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:49.124865 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert podName:c3302a13-45f4-425e-b2d2-ff221a9e7b91 nodeName:}" failed. No retries permitted until 2026-04-20 12:15:05.124847405 +0000 UTC m=+70.677071574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-t44j5" (UID: "c3302a13-45f4-425e-b2d2-ff221a9e7b91") : secret "networking-console-plugin-cert" not found
Apr 20 12:14:50.142170 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:50.142144 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zdf9s_a182a959-9bf8-48fe-b024-32a9f697eb23/node-ca/0.log"
Apr 20 12:14:50.388438 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:50.388409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rkbqg" event={"ID":"51f2c5d6-8d34-4caf-b764-5fd970fa149b","Type":"ContainerStarted","Data":"c26b73013c32e19a029840152801c063d2f060ef9ad35c02e99914f433ba96f4"}
Apr 20 12:14:50.405230 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:50.405148 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rkbqg" podStartSLOduration=34.968718683 podStartE2EDuration="39.405134929s" podCreationTimestamp="2026-04-20 12:14:11 +0000 UTC" firstStartedPulling="2026-04-20 12:14:45.151139216 +0000 UTC m=+50.703363382" lastFinishedPulling="2026-04-20 12:14:49.587555456 +0000 UTC m=+55.139779628" observedRunningTime="2026-04-20 12:14:50.404255091 +0000 UTC m=+55.956479283" watchObservedRunningTime="2026-04-20 12:14:50.405134929 +0000 UTC m=+55.957359117"
Apr 20 12:14:51.344478 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:51.344451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gnj2p_df9d28bf-8c80-47b4-8d39-a66df9464d5b/kube-storage-version-migrator-operator/0.log"
Apr 20 12:14:53.259087 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:53.259061 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zmhl5"
Apr 20 12:14:53.294122 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:53.294096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:53.294122 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:53.294127 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:14:53.294411 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:53.294399 2577 scope.go:117] "RemoveContainer" containerID="f17ca09842261ddfe1a88865d95ef5b525cfaa97509dd3c2d51762b04a285147"
Apr 20 12:14:53.294561 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:53.294546 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tsf4n_openshift-console-operator(f7c3b32b-9b04-43fd-b10e-f895344efb6a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" podUID="f7c3b32b-9b04-43fd-b10e-f895344efb6a"
Apr 20 12:14:59.715215 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.715187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn"
Apr 20 12:14:59.715735 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.715234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc"
Apr 20 12:14:59.715735 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.715263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq"
Apr 20 12:14:59.717790 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.717760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be5daddb-3077-4d22-8e15-d75f45ef9c2a-metrics-tls\") pod \"dns-default-hxqfn\" (UID: \"be5daddb-3077-4d22-8e15-d75f45ef9c2a\") " pod="openshift-dns/dns-default-hxqfn"
Apr 20 12:14:59.717903 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.717811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"image-registry-5b45df776f-whmlc\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") " pod="openshift-image-registry/image-registry-5b45df776f-whmlc"
Apr 20 12:14:59.717903 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.717849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37056e79-d3b3-4b8c-954f-232d91e2a9a6-cert\") pod \"ingress-canary-nnqtq\" (UID: \"37056e79-d3b3-4b8c-954f-232d91e2a9a6\") " pod="openshift-ingress-canary/ingress-canary-nnqtq"
Apr 20 12:14:59.815802 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.815771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:14:59.818569 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.818552 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 12:14:59.826694 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:59.826679 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 12:14:59.826764 ip-10-0-131-55 kubenswrapper[2577]: E0420 12:14:59.826742 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs podName:016f5832-4461-44e1-b03e-5ca0dc88515d nodeName:}" failed. No retries permitted until 2026-04-20 12:16:03.82672679 +0000 UTC m=+129.378950957 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs") pod "network-metrics-daemon-jnnsm" (UID: "016f5832-4461-44e1-b03e-5ca0dc88515d") : secret "metrics-daemon-secret" not found
Apr 20 12:14:59.916777 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.916748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:59.919301 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.919277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t626p\" (UniqueName: \"kubernetes.io/projected/93e3b405-9e2d-44f9-8fc2-b7a191baecfe-kube-api-access-t626p\") pod \"network-check-target-gbv8h\" (UID: \"93e3b405-9e2d-44f9-8fc2-b7a191baecfe\") " pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:14:59.927843 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.927826 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2vqb9\""
Apr 20 12:14:59.936062 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.936047 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b45df776f-whmlc"
Apr 20 12:14:59.938764 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.938748 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8c8ph\""
Apr 20 12:14:59.945866 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.945848 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g4vct\""
Apr 20 12:14:59.946869 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.946849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hxqfn"
Apr 20 12:14:59.953905 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:14:59.953885 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nnqtq"
Apr 20 12:15:00.098367 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.098331 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b45df776f-whmlc"]
Apr 20 12:15:00.104136 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:00.104109 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6d046ee_c2af_433b_9120_a41c0d53be7b.slice/crio-248da8158de7433ffa63109c1e382003c3d19b1e28f27073ff48d463becc5126 WatchSource:0}: Error finding container 248da8158de7433ffa63109c1e382003c3d19b1e28f27073ff48d463becc5126: Status 404 returned error can't find the container with id 248da8158de7433ffa63109c1e382003c3d19b1e28f27073ff48d463becc5126
Apr 20 12:15:00.179535 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.179516 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p76zv\""
Apr 20 12:15:00.187305 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.187278 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:15:00.307183 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.307151 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gbv8h"]
Apr 20 12:15:00.310818 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:00.310778 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e3b405_9e2d_44f9_8fc2_b7a191baecfe.slice/crio-2fc74ed4a6b9260aee0f9913d45095ee49975c715d7c110b2e5e18e7a9fbee87 WatchSource:0}: Error finding container 2fc74ed4a6b9260aee0f9913d45095ee49975c715d7c110b2e5e18e7a9fbee87: Status 404 returned error can't find the container with id 2fc74ed4a6b9260aee0f9913d45095ee49975c715d7c110b2e5e18e7a9fbee87
Apr 20 12:15:00.316573 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.316550 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nnqtq"]
Apr 20 12:15:00.319954 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:00.319930 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37056e79_d3b3_4b8c_954f_232d91e2a9a6.slice/crio-d0ada19e6ceacf68f3c3be57fc8fe61c928b0f38d8e3104957b57ebc82ce9d2e WatchSource:0}: Error finding container d0ada19e6ceacf68f3c3be57fc8fe61c928b0f38d8e3104957b57ebc82ce9d2e: Status 404 returned error can't find the container with id d0ada19e6ceacf68f3c3be57fc8fe61c928b0f38d8e3104957b57ebc82ce9d2e
Apr 20 12:15:00.320955 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.320936 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hxqfn"]
Apr 20 12:15:00.322852 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:00.322831 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5daddb_3077_4d22_8e15_d75f45ef9c2a.slice/crio-713abcfbc4007d8c57510586524cba6074cf948bfd123093eb272bd4ed6c8b9f WatchSource:0}: Error finding container 713abcfbc4007d8c57510586524cba6074cf948bfd123093eb272bd4ed6c8b9f: Status 404 returned error can't find the container with id 713abcfbc4007d8c57510586524cba6074cf948bfd123093eb272bd4ed6c8b9f
Apr 20 12:15:00.414570 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.414542 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hxqfn" event={"ID":"be5daddb-3077-4d22-8e15-d75f45ef9c2a","Type":"ContainerStarted","Data":"713abcfbc4007d8c57510586524cba6074cf948bfd123093eb272bd4ed6c8b9f"}
Apr 20 12:15:00.415836 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.415790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gbv8h" event={"ID":"93e3b405-9e2d-44f9-8fc2-b7a191baecfe","Type":"ContainerStarted","Data":"a4581cd225b6dd5ef90eacbc84acb0e3bc3515d9c3eff66501c1a033cab7a4e9"}
Apr 20 12:15:00.415836 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.415830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gbv8h" event={"ID":"93e3b405-9e2d-44f9-8fc2-b7a191baecfe","Type":"ContainerStarted","Data":"2fc74ed4a6b9260aee0f9913d45095ee49975c715d7c110b2e5e18e7a9fbee87"}
Apr 20 12:15:00.416008 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.415949 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gbv8h"
Apr 20 12:15:00.416879 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.416855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nnqtq" event={"ID":"37056e79-d3b3-4b8c-954f-232d91e2a9a6","Type":"ContainerStarted","Data":"d0ada19e6ceacf68f3c3be57fc8fe61c928b0f38d8e3104957b57ebc82ce9d2e"}
Apr 20 12:15:00.418093 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.418071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" event={"ID":"d6d046ee-c2af-433b-9120-a41c0d53be7b","Type":"ContainerStarted","Data":"e783b74e8f49dff9e96393a2f5e01949e4ce82dfbf92f0f7b9c71a9f10d5b1fa"}
Apr 20 12:15:00.418183 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.418100 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" event={"ID":"d6d046ee-c2af-433b-9120-a41c0d53be7b","Type":"ContainerStarted","Data":"248da8158de7433ffa63109c1e382003c3d19b1e28f27073ff48d463becc5126"}
Apr 20 12:15:00.418248 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.418234 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b45df776f-whmlc"
Apr 20 12:15:00.432446 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.432411 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gbv8h" podStartSLOduration=65.432400075 podStartE2EDuration="1m5.432400075s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:15:00.431666765 +0000 UTC m=+65.983890950" watchObservedRunningTime="2026-04-20 12:15:00.432400075 +0000 UTC m=+65.984624263"
Apr 20 12:15:00.450344 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:00.450302 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" podStartSLOduration=65.450291267 podStartE2EDuration="1m5.450291267s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:15:00.449573733 +0000 UTC m=+66.001797932" watchObservedRunningTime="2026-04-20 12:15:00.450291267 +0000 UTC m=+66.002515457"
Apr 20 12:15:03.430751 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:03.430720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hxqfn" event={"ID":"be5daddb-3077-4d22-8e15-d75f45ef9c2a","Type":"ContainerStarted","Data":"fcd30fca3fd7c1a0d9b94da8b0452c4e43f3c89f35fd74b1c2b8d190cd0bc1a6"}
Apr 20 12:15:03.431237 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:03.430759 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hxqfn" event={"ID":"be5daddb-3077-4d22-8e15-d75f45ef9c2a","Type":"ContainerStarted","Data":"8893525a6ae5964c167e465d45f1b8bc5d446ba1a2422bc463803a523508daac"}
Apr 20 12:15:03.431237 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:03.430829 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hxqfn"
Apr 20 12:15:03.431979 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:03.431960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nnqtq" event={"ID":"37056e79-d3b3-4b8c-954f-232d91e2a9a6","Type":"ContainerStarted","Data":"841b791876023f22482c0c733dda289673e590a1abab6843d3c298f91442af5d"}
Apr 20 12:15:03.452525 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:03.450045 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hxqfn" podStartSLOduration=33.933408354 podStartE2EDuration="36.450028828s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:15:00.324581294 +0000 UTC m=+65.876805460" lastFinishedPulling="2026-04-20 12:15:02.841201763 +0000 UTC m=+68.393425934" observedRunningTime="2026-04-20 12:15:03.44727244 +0000 UTC m=+68.999496639" watchObservedRunningTime="2026-04-20 12:15:03.450028828 +0000 UTC m=+69.002253018"
Apr 20 12:15:03.465705 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:03.465624 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nnqtq" podStartSLOduration=33.942046744 podStartE2EDuration="36.465611186s" podCreationTimestamp="2026-04-20 12:14:27 +0000 UTC" firstStartedPulling="2026-04-20 12:15:00.321933797 +0000 UTC m=+65.874157967" lastFinishedPulling="2026-04-20 12:15:02.84549823 +0000 UTC m=+68.397722409" observedRunningTime="2026-04-20 12:15:03.465131318 +0000 UTC m=+69.017355505" watchObservedRunningTime="2026-04-20 12:15:03.465611186 +0000 UTC m=+69.017835374"
Apr 20 12:15:04.963445 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:04.963406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:04.963445 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:04.963452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:04.964135 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:04.963497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"
Apr 20 12:15:04.964230 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:04.964207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/057fb086-1c37-429a-8d5b-48a1306a3deb-service-ca-bundle\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:04.966172 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:04.966151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f923f5f1-db46-4d28-810d-3ed65437dba9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-sb889\" (UID: \"f923f5f1-db46-4d28-810d-3ed65437dba9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"
Apr 20 12:15:04.966225 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:04.966151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/057fb086-1c37-429a-8d5b-48a1306a3deb-metrics-certs\") pod \"router-default-964dd4574-ch7vj\" (UID: \"057fb086-1c37-429a-8d5b-48a1306a3deb\") " pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:05.064707 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.064674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:15:05.067162 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.067134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa839e80-dc90-4cc7-9ee9-2520a9717383-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-cg6bv\" (UID: \"fa839e80-dc90-4cc7-9ee9-2520a9717383\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:15:05.071018 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.070996 2577 scope.go:117] "RemoveContainer" containerID="f17ca09842261ddfe1a88865d95ef5b525cfaa97509dd3c2d51762b04a285147"
Apr 20 12:15:05.124929 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.124906 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9xkrp\""
Apr 20 12:15:05.132959 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.132929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:05.165285 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.165248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:15:05.168595 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.168556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c3302a13-45f4-425e-b2d2-ff221a9e7b91-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-t44j5\" (UID: \"c3302a13-45f4-425e-b2d2-ff221a9e7b91\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:15:05.193573 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.193537 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jz8nn\""
Apr 20 12:15:05.202179 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.202147 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"
Apr 20 12:15:05.226620 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.226594 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-fbd5q\""
Apr 20 12:15:05.230733 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.230707 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"
Apr 20 12:15:05.272539 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:05.272505 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057fb086_1c37_429a_8d5b_48a1306a3deb.slice/crio-646aaec9ede6cba7fd9946a69a061ca3febc8532a74cb4221a9cd8f1a4e4b81e WatchSource:0}: Error finding container 646aaec9ede6cba7fd9946a69a061ca3febc8532a74cb4221a9cd8f1a4e4b81e: Status 404 returned error can't find the container with id 646aaec9ede6cba7fd9946a69a061ca3febc8532a74cb4221a9cd8f1a4e4b81e
Apr 20 12:15:05.272539 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.272533 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-964dd4574-ch7vj"]
Apr 20 12:15:05.335875 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.335815 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qc95s\""
Apr 20 12:15:05.340449 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.340426 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"
Apr 20 12:15:05.347002 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.346951 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889"]
Apr 20 12:15:05.398241 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.397786 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv"]
Apr 20 12:15:05.401699 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:05.401653 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa839e80_dc90_4cc7_9ee9_2520a9717383.slice/crio-bbbcb3e424632dae18d1b736b50d863470dc872b32c34e38022351169288caaa WatchSource:0}: Error finding container bbbcb3e424632dae18d1b736b50d863470dc872b32c34e38022351169288caaa: Status 404 returned error can't find the container with id bbbcb3e424632dae18d1b736b50d863470dc872b32c34e38022351169288caaa
Apr 20 12:15:05.442595 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.442558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" event={"ID":"fa839e80-dc90-4cc7-9ee9-2520a9717383","Type":"ContainerStarted","Data":"bbbcb3e424632dae18d1b736b50d863470dc872b32c34e38022351169288caaa"}
Apr 20 12:15:05.443631 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.443596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" event={"ID":"f923f5f1-db46-4d28-810d-3ed65437dba9","Type":"ContainerStarted","Data":"5c779e83af090d159074d8826622c1f46dd86beb3b5b520686b23bcff6805c52"}
Apr 20 12:15:05.444910 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.444887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-964dd4574-ch7vj" event={"ID":"057fb086-1c37-429a-8d5b-48a1306a3deb","Type":"ContainerStarted","Data":"7a0158de20d155c27fb70433fcd11fb0f323a6588de367ecc40672df7f89eb1f"}
Apr 20 12:15:05.444910 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.444910 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-964dd4574-ch7vj" event={"ID":"057fb086-1c37-429a-8d5b-48a1306a3deb","Type":"ContainerStarted","Data":"646aaec9ede6cba7fd9946a69a061ca3febc8532a74cb4221a9cd8f1a4e4b81e"}
Apr 20 12:15:05.447408 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.447381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:15:05.447526 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.447480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" event={"ID":"f7c3b32b-9b04-43fd-b10e-f895344efb6a","Type":"ContainerStarted","Data":"c7a4688ff2ef2627fd2749f6b8fcc68e93b73686740787f0560a458bbd14aa2d"}
Apr 20 12:15:05.450076 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.449916 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:15:05.469894 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.469802 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-964dd4574-ch7vj" podStartSLOduration=33.4697896 podStartE2EDuration="33.4697896s" podCreationTimestamp="2026-04-20 12:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:15:05.468600331 +0000 UTC m=+71.020824544" watchObservedRunningTime="2026-04-20 12:15:05.4697896 +0000 UTC m=+71.022013787"
Apr 20 12:15:05.483516 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.483494 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-t44j5"]
Apr 20 12:15:05.485356 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.485301 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n" podStartSLOduration=26.338073785 podStartE2EDuration="33.485285945s" podCreationTimestamp="2026-04-20 12:14:32 +0000 UTC" firstStartedPulling="2026-04-20 12:14:37.737345472 +0000 UTC m=+43.289569638" lastFinishedPulling="2026-04-20 12:14:44.88455763 +0000 UTC m=+50.436781798" observedRunningTime="2026-04-20 12:15:05.484628208 +0000 UTC m=+71.036852395" watchObservedRunningTime="2026-04-20 12:15:05.485285945 +0000 UTC m=+71.037510134"
Apr 20 12:15:05.486557 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:05.486528 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3302a13_45f4_425e_b2d2_ff221a9e7b91.slice/crio-686bcac24f83a37b4fd81f02ee4e3e6fbe4669df79d894415805132cc891e8c7 WatchSource:0}: Error finding container 686bcac24f83a37b4fd81f02ee4e3e6fbe4669df79d894415805132cc891e8c7: Status 404 returned error can't find the container with id 686bcac24f83a37b4fd81f02ee4e3e6fbe4669df79d894415805132cc891e8c7
Apr 20 12:15:05.532697 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:05.532634 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-tsf4n"
Apr 20 12:15:06.133334 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:06.133290 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:06.135957 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:06.135933 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:06.454428 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:06.454357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" event={"ID":"c3302a13-45f4-425e-b2d2-ff221a9e7b91","Type":"ContainerStarted","Data":"686bcac24f83a37b4fd81f02ee4e3e6fbe4669df79d894415805132cc891e8c7"}
Apr 20 12:15:06.454941 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:06.454923 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:06.456318 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:06.456289 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-964dd4574-ch7vj"
Apr 20 12:15:07.458375 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:07.458340 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" event={"ID":"c3302a13-45f4-425e-b2d2-ff221a9e7b91","Type":"ContainerStarted","Data":"cbe1f8f0a9fac994def354a6f871af0b0948d9a485520b818d8d4dea8e9125e1"}
Apr 20 12:15:07.476382 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:07.476324 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-t44j5" podStartSLOduration=32.719786537 podStartE2EDuration="34.476304509s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:15:05.488464145 +0000 UTC m=+71.040688310" lastFinishedPulling="2026-04-20 12:15:07.244982112 +0000 UTC m=+72.797206282" observedRunningTime="2026-04-20 12:15:07.473288535 +0000 UTC m=+73.025512723" watchObservedRunningTime="2026-04-20 12:15:07.476304509 +0000 UTC m=+73.028528701"
Apr 20 12:15:09.464708 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:09.464675 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" event={"ID":"fa839e80-dc90-4cc7-9ee9-2520a9717383","Type":"ContainerStarted","Data":"060d6db07631d22c189d5b962fc39aeb99f542538f95e48b185bfacbbd8cc8a0"}
Apr 20 12:15:09.466420 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:09.466398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" event={"ID":"f923f5f1-db46-4d28-810d-3ed65437dba9","Type":"ContainerStarted","Data":"7ea7c18d65afebc09aeef5c91955ef2431fe986d362b606b3868962f08aa4407"}
Apr 20 12:15:09.466538 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:09.466424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" event={"ID":"f923f5f1-db46-4d28-810d-3ed65437dba9","Type":"ContainerStarted","Data":"02016493e3bc2731be70411716429b5cce121bc24ed62701afb8391b68c0e10d"}
Apr 20 12:15:09.495091 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:09.491339 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-cg6bv" podStartSLOduration=33.057186146 podStartE2EDuration="36.491323368s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:15:05.404006551 +0000 UTC m=+70.956230731" lastFinishedPulling="2026-04-20 12:15:08.838143778 +0000 UTC m=+74.390367953" observedRunningTime="2026-04-20 12:15:09.490037732 +0000 UTC m=+75.042261920" watchObservedRunningTime="2026-04-20 12:15:09.491323368 +0000 UTC m=+75.043547558"
Apr 20 12:15:09.513468 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:09.513415 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-sb889" podStartSLOduration=33.096533452 podStartE2EDuration="36.513403469s" podCreationTimestamp="2026-04-20 12:14:33 +0000 UTC" firstStartedPulling="2026-04-20 12:15:05.417047777 +0000 UTC m=+70.969271946" lastFinishedPulling="2026-04-20 12:15:08.833917785 +0000 UTC m=+74.386141963" observedRunningTime="2026-04-20 12:15:09.512750191 +0000 UTC m=+75.064974380" watchObservedRunningTime="2026-04-20 12:15:09.513403469 +0000 UTC m=+75.065627657"
Apr 20 12:15:11.106907 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.106874 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-54t5x"]
Apr 20 12:15:11.143893 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.143827 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-54t5x"]
Apr 20 12:15:11.144009 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.143936 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-54t5x"
Apr 20 12:15:11.146617 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.146596 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 12:15:11.146840 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.146822 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cqzj8\""
Apr 20 12:15:11.147690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.147665 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 12:15:11.322528 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.322495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f6be8bd-925b-4342-9fc0-d43291f25a6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-54t5x\" (UID: 
\"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.322528 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.322534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f6be8bd-925b-4342-9fc0-d43291f25a6c-crio-socket\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.322741 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.322572 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f6be8bd-925b-4342-9fc0-d43291f25a6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.322741 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.322632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkwq6\" (UniqueName: \"kubernetes.io/projected/0f6be8bd-925b-4342-9fc0-d43291f25a6c-kube-api-access-wkwq6\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.322805 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.322742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f6be8bd-925b-4342-9fc0-d43291f25a6c-data-volume\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423250 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423174 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wkwq6\" (UniqueName: \"kubernetes.io/projected/0f6be8bd-925b-4342-9fc0-d43291f25a6c-kube-api-access-wkwq6\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423250 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f6be8bd-925b-4342-9fc0-d43291f25a6c-data-volume\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423463 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f6be8bd-925b-4342-9fc0-d43291f25a6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423463 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f6be8bd-925b-4342-9fc0-d43291f25a6c-crio-socket\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423463 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f6be8bd-925b-4342-9fc0-d43291f25a6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " 
pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423463 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0f6be8bd-925b-4342-9fc0-d43291f25a6c-crio-socket\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423690 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0f6be8bd-925b-4342-9fc0-d43291f25a6c-data-volume\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.423829 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.423812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0f6be8bd-925b-4342-9fc0-d43291f25a6c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.425876 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.425844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0f6be8bd-925b-4342-9fc0-d43291f25a6c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.434809 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.434781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkwq6\" (UniqueName: 
\"kubernetes.io/projected/0f6be8bd-925b-4342-9fc0-d43291f25a6c-kube-api-access-wkwq6\") pod \"insights-runtime-extractor-54t5x\" (UID: \"0f6be8bd-925b-4342-9fc0-d43291f25a6c\") " pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.453630 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.453608 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-54t5x" Apr 20 12:15:11.573626 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:11.573566 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-54t5x"] Apr 20 12:15:11.575742 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:11.575715 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6be8bd_925b_4342_9fc0_d43291f25a6c.slice/crio-a758e435d394a4f5c573413d97d368d3a190e0a11278b245d6076157dbf55baa WatchSource:0}: Error finding container a758e435d394a4f5c573413d97d368d3a190e0a11278b245d6076157dbf55baa: Status 404 returned error can't find the container with id a758e435d394a4f5c573413d97d368d3a190e0a11278b245d6076157dbf55baa Apr 20 12:15:12.476557 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:12.476492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54t5x" event={"ID":"0f6be8bd-925b-4342-9fc0-d43291f25a6c","Type":"ContainerStarted","Data":"e373211a541012e2299be63f840ec7b304a0be0f981d4df15812ca51cba825da"} Apr 20 12:15:12.476557 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:12.476525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54t5x" event={"ID":"0f6be8bd-925b-4342-9fc0-d43291f25a6c","Type":"ContainerStarted","Data":"a758e435d394a4f5c573413d97d368d3a190e0a11278b245d6076157dbf55baa"} Apr 20 12:15:13.438619 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:13.438585 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hxqfn" Apr 20 12:15:13.481540 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:13.481502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54t5x" event={"ID":"0f6be8bd-925b-4342-9fc0-d43291f25a6c","Type":"ContainerStarted","Data":"d8c188e0160777fd2bf4136d32bad74b5c817ef1e3c034ca4aaf7713c2298efb"} Apr 20 12:15:15.493071 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:15.493037 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-54t5x" event={"ID":"0f6be8bd-925b-4342-9fc0-d43291f25a6c","Type":"ContainerStarted","Data":"4b7e522acfbadbf3683656a0498f59506bdb6e73660ddda5e429e4006f92dbd1"} Apr 20 12:15:15.511558 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:15.511519 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-54t5x" podStartSLOduration=1.822433068 podStartE2EDuration="4.511505625s" podCreationTimestamp="2026-04-20 12:15:11 +0000 UTC" firstStartedPulling="2026-04-20 12:15:11.717716284 +0000 UTC m=+77.269940450" lastFinishedPulling="2026-04-20 12:15:14.406788842 +0000 UTC m=+79.959013007" observedRunningTime="2026-04-20 12:15:15.509942804 +0000 UTC m=+81.062166995" watchObservedRunningTime="2026-04-20 12:15:15.511505625 +0000 UTC m=+81.063729813" Apr 20 12:15:19.940811 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:19.940765 2577 patch_prober.go:28] interesting pod/image-registry-5b45df776f-whmlc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 12:15:19.941189 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:19.940872 2577 prober.go:120] "Probe failed" probeType="Liveness" 
pod="openshift-image-registry/image-registry-5b45df776f-whmlc" podUID="d6d046ee-c2af-433b-9120-a41c0d53be7b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 12:15:20.910514 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.910488 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6lq9d"] Apr 20 12:15:20.915075 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.915056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:20.917475 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.917451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 12:15:20.917698 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.917680 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 12:15:20.917792 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.917716 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-jsw86\"" Apr 20 12:15:20.917792 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.917729 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 12:15:20.917792 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:20.917754 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 12:15:21.097000 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.096966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-sys\") pod \"node-exporter-6lq9d\" (UID: 
\"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-tls\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-textfile\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-root\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-metrics-client-ca\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-accelerators-collector-config\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-kube-api-access-lvbzs\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.097341 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.097264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-wtmp\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198501 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-sys\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198501 ip-10-0-131-55 
kubenswrapper[2577]: I0420 12:15:21.198478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-tls\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198705 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-sys\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198705 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-textfile\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198705 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-root\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198705 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-metrics-client-ca\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198887 ip-10-0-131-55 kubenswrapper[2577]: 
I0420 12:15:21.198718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198887 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-accelerators-collector-config\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.198887 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-root\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.199028 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-kube-api-access-lvbzs\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.199028 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-wtmp\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " 
pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.199028 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.198970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-textfile\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.199181 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.199107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-wtmp\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.199251 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.199227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-metrics-client-ca\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.199589 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.199563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-accelerators-collector-config\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.201327 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.201308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-tls\") pod 
\"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.201455 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.201438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.220748 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.220733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/7ad7aa28-ceb4-4cd7-9451-71af9edfd101-kube-api-access-lvbzs\") pod \"node-exporter-6lq9d\" (UID: \"7ad7aa28-ceb4-4cd7-9451-71af9edfd101\") " pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.224380 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.224366 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6lq9d" Apr 20 12:15:21.234406 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:15:21.234378 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad7aa28_ceb4_4cd7_9451_71af9edfd101.slice/crio-774943c1a3aea9506982fdd01ff8db07592cdf2fa87c0d9de6cb81cd53d5a3ac WatchSource:0}: Error finding container 774943c1a3aea9506982fdd01ff8db07592cdf2fa87c0d9de6cb81cd53d5a3ac: Status 404 returned error can't find the container with id 774943c1a3aea9506982fdd01ff8db07592cdf2fa87c0d9de6cb81cd53d5a3ac Apr 20 12:15:21.427001 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.426972 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" Apr 20 12:15:21.510201 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:21.510113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lq9d" event={"ID":"7ad7aa28-ceb4-4cd7-9451-71af9edfd101","Type":"ContainerStarted","Data":"774943c1a3aea9506982fdd01ff8db07592cdf2fa87c0d9de6cb81cd53d5a3ac"} Apr 20 12:15:22.514161 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:22.514074 2577 generic.go:358] "Generic (PLEG): container finished" podID="7ad7aa28-ceb4-4cd7-9451-71af9edfd101" containerID="46591c3d086752489432172ef48000a68de4be09e8fb42828886f5de4ffdd08c" exitCode=0 Apr 20 12:15:22.514509 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:22.514150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lq9d" event={"ID":"7ad7aa28-ceb4-4cd7-9451-71af9edfd101","Type":"ContainerDied","Data":"46591c3d086752489432172ef48000a68de4be09e8fb42828886f5de4ffdd08c"} Apr 20 12:15:23.519501 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:23.519464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lq9d" 
event={"ID":"7ad7aa28-ceb4-4cd7-9451-71af9edfd101","Type":"ContainerStarted","Data":"e4d618195ee62c5532e80d84d02e5d594dcf8a9fc1360fc2647cd25a296b3f57"} Apr 20 12:15:23.519501 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:23.519504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6lq9d" event={"ID":"7ad7aa28-ceb4-4cd7-9451-71af9edfd101","Type":"ContainerStarted","Data":"9a524a44ac5bffc25460a661b7fa9564f4820cdd9f838b288b79d01080428c08"} Apr 20 12:15:23.538471 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:23.538433 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6lq9d" podStartSLOduration=2.656787889 podStartE2EDuration="3.538421005s" podCreationTimestamp="2026-04-20 12:15:20 +0000 UTC" firstStartedPulling="2026-04-20 12:15:21.236292397 +0000 UTC m=+86.788516562" lastFinishedPulling="2026-04-20 12:15:22.117925511 +0000 UTC m=+87.670149678" observedRunningTime="2026-04-20 12:15:23.537316625 +0000 UTC m=+89.089540834" watchObservedRunningTime="2026-04-20 12:15:23.538421005 +0000 UTC m=+89.090645224" Apr 20 12:15:31.425819 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:31.425726 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gbv8h" Apr 20 12:15:33.475223 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:33.475180 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b45df776f-whmlc"] Apr 20 12:15:55.610306 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:55.610272 2577 generic.go:358] "Generic (PLEG): container finished" podID="df9d28bf-8c80-47b4-8d39-a66df9464d5b" containerID="7d5e254c8785e0bb845082376141ad40ce9187b2ca6a6bfa2593d6cac361ca8c" exitCode=0 Apr 20 12:15:55.610678 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:55.610338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p" event={"ID":"df9d28bf-8c80-47b4-8d39-a66df9464d5b","Type":"ContainerDied","Data":"7d5e254c8785e0bb845082376141ad40ce9187b2ca6a6bfa2593d6cac361ca8c"}
Apr 20 12:15:55.610678 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:55.610616 2577 scope.go:117] "RemoveContainer" containerID="7d5e254c8785e0bb845082376141ad40ce9187b2ca6a6bfa2593d6cac361ca8c"
Apr 20 12:15:56.615084 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:56.615051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gnj2p" event={"ID":"df9d28bf-8c80-47b4-8d39-a66df9464d5b","Type":"ContainerStarted","Data":"562107fb3394ced94303fb3ce129078d1513e8d5aed966d6da791f1b06ad8d27"}
Apr 20 12:15:58.495143 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.495099 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" podUID="d6d046ee-c2af-433b-9120-a41c0d53be7b" containerName="registry" containerID="cri-o://e783b74e8f49dff9e96393a2f5e01949e4ce82dfbf92f0f7b9c71a9f10d5b1fa" gracePeriod=30
Apr 20 12:15:58.623631 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.623603 2577 generic.go:358] "Generic (PLEG): container finished" podID="d6d046ee-c2af-433b-9120-a41c0d53be7b" containerID="e783b74e8f49dff9e96393a2f5e01949e4ce82dfbf92f0f7b9c71a9f10d5b1fa" exitCode=0
Apr 20 12:15:58.623788 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.623671 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" event={"ID":"d6d046ee-c2af-433b-9120-a41c0d53be7b","Type":"ContainerDied","Data":"e783b74e8f49dff9e96393a2f5e01949e4ce82dfbf92f0f7b9c71a9f10d5b1fa"}
Apr 20 12:15:58.724426 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.724404 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b45df776f-whmlc"
Apr 20 12:15:58.829289 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829212 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-certificates\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829289 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829256 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-bound-sa-token\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829469 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829392 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-installation-pull-secrets\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829469 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829445 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-trusted-ca\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829532 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829473 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-image-registry-private-configuration\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829532 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829501 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829716 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829528 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs8wt\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-kube-api-access-vs8wt\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829716 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829571 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6d046ee-c2af-433b-9120-a41c0d53be7b-ca-trust-extracted\") pod \"d6d046ee-c2af-433b-9120-a41c0d53be7b\" (UID: \"d6d046ee-c2af-433b-9120-a41c0d53be7b\") "
Apr 20 12:15:58.829836 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829697 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:15:58.829921 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.829898 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-certificates\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.830032 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.830001 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 12:15:58.831978 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.831953 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:15:58.832284 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.832257 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:15:58.832284 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.832266 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 12:15:58.832400 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.832282 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:15:58.832564 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.832543 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-kube-api-access-vs8wt" (OuterVolumeSpecName: "kube-api-access-vs8wt") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "kube-api-access-vs8wt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 12:15:58.839032 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.839007 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6d046ee-c2af-433b-9120-a41c0d53be7b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d6d046ee-c2af-433b-9120-a41c0d53be7b" (UID: "d6d046ee-c2af-433b-9120-a41c0d53be7b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 12:15:58.930668 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930616 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-installation-pull-secrets\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.930668 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930667 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6d046ee-c2af-433b-9120-a41c0d53be7b-trusted-ca\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.930821 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930684 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d6d046ee-c2af-433b-9120-a41c0d53be7b-image-registry-private-configuration\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.930821 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930697 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-registry-tls\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.930821 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930709 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vs8wt\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-kube-api-access-vs8wt\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.930821 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930722 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6d046ee-c2af-433b-9120-a41c0d53be7b-ca-trust-extracted\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:58.930821 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:58.930734 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6d046ee-c2af-433b-9120-a41c0d53be7b-bound-sa-token\") on node \"ip-10-0-131-55.ec2.internal\" DevicePath \"\""
Apr 20 12:15:59.628284 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:59.628248 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b45df776f-whmlc" event={"ID":"d6d046ee-c2af-433b-9120-a41c0d53be7b","Type":"ContainerDied","Data":"248da8158de7433ffa63109c1e382003c3d19b1e28f27073ff48d463becc5126"}
Apr 20 12:15:59.628284 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:59.628266 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b45df776f-whmlc"
Apr 20 12:15:59.628844 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:59.628294 2577 scope.go:117] "RemoveContainer" containerID="e783b74e8f49dff9e96393a2f5e01949e4ce82dfbf92f0f7b9c71a9f10d5b1fa"
Apr 20 12:15:59.649791 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:59.649769 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5b45df776f-whmlc"]
Apr 20 12:15:59.652763 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:15:59.652740 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5b45df776f-whmlc"]
Apr 20 12:16:01.070542 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:01.070509 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6d046ee-c2af-433b-9120-a41c0d53be7b" path="/var/lib/kubelet/pods/d6d046ee-c2af-433b-9120-a41c0d53be7b/volumes"
Apr 20 12:16:03.875291 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:03.875248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:16:03.877702 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:03.877682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/016f5832-4461-44e1-b03e-5ca0dc88515d-metrics-certs\") pod \"network-metrics-daemon-jnnsm\" (UID: \"016f5832-4461-44e1-b03e-5ca0dc88515d\") " pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:16:03.887152 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:03.887132 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kwdzh\""
Apr 20 12:16:03.895305 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:03.895290 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jnnsm"
Apr 20 12:16:04.014810 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:04.014788 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jnnsm"]
Apr 20 12:16:04.016950 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:16:04.016922 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod016f5832_4461_44e1_b03e_5ca0dc88515d.slice/crio-fe60b02f714adb4f6b303da7124c48d6b6ddb7266092a7d06f827379b88b0267 WatchSource:0}: Error finding container fe60b02f714adb4f6b303da7124c48d6b6ddb7266092a7d06f827379b88b0267: Status 404 returned error can't find the container with id fe60b02f714adb4f6b303da7124c48d6b6ddb7266092a7d06f827379b88b0267
Apr 20 12:16:04.643369 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:04.643332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jnnsm" event={"ID":"016f5832-4461-44e1-b03e-5ca0dc88515d","Type":"ContainerStarted","Data":"fe60b02f714adb4f6b303da7124c48d6b6ddb7266092a7d06f827379b88b0267"}
Apr 20 12:16:05.648046 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:05.648013 2577 generic.go:358] "Generic (PLEG): container finished" podID="b845784c-6d33-44f1-8015-6ea907093662" containerID="d2d1c4d1667f9a6d7410deef21bc4f68e71d3b1ca3e1a0e1d9b9fddddc5c009d" exitCode=0
Apr 20 12:16:05.648419 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:05.648096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq" event={"ID":"b845784c-6d33-44f1-8015-6ea907093662","Type":"ContainerDied","Data":"d2d1c4d1667f9a6d7410deef21bc4f68e71d3b1ca3e1a0e1d9b9fddddc5c009d"}
Apr 20 12:16:05.648478 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:05.648462 2577 scope.go:117] "RemoveContainer" containerID="d2d1c4d1667f9a6d7410deef21bc4f68e71d3b1ca3e1a0e1d9b9fddddc5c009d"
Apr 20 12:16:05.649752 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:05.649725 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jnnsm" event={"ID":"016f5832-4461-44e1-b03e-5ca0dc88515d","Type":"ContainerStarted","Data":"4ad3ac278a5a61563003178d703a8714fc18a51f444c877470821d27b62286d1"}
Apr 20 12:16:05.649863 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:05.649763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jnnsm" event={"ID":"016f5832-4461-44e1-b03e-5ca0dc88515d","Type":"ContainerStarted","Data":"074b15338b5af0bb96270a768b1152c8356d38b05aa3432eff2266b1db817ea3"}
Apr 20 12:16:05.686376 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:05.686325 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jnnsm" podStartSLOduration=129.731433955 podStartE2EDuration="2m10.686308289s" podCreationTimestamp="2026-04-20 12:13:55 +0000 UTC" firstStartedPulling="2026-04-20 12:16:04.018672212 +0000 UTC m=+129.570896379" lastFinishedPulling="2026-04-20 12:16:04.973546544 +0000 UTC m=+130.525770713" observedRunningTime="2026-04-20 12:16:05.685926099 +0000 UTC m=+131.238150300" watchObservedRunningTime="2026-04-20 12:16:05.686308289 +0000 UTC m=+131.238532478"
Apr 20 12:16:06.654930 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:06.654896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-x6nvq" event={"ID":"b845784c-6d33-44f1-8015-6ea907093662","Type":"ContainerStarted","Data":"693f3a0237eae113635d3a9b19b7464b463ff6b7f27415da14d78c6187893b33"}
Apr 20 12:16:11.670359 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:11.670324 2577 generic.go:358] "Generic (PLEG): container finished" podID="390328d7-a7ce-4e5b-bb2b-853e4f3b21d7" containerID="96298c2ee1784a3326952759e70d20539ac45b3fb2d85d3b55095ebe538feb15" exitCode=0
Apr 20 12:16:11.670742 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:11.670399 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hlmfm" event={"ID":"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7","Type":"ContainerDied","Data":"96298c2ee1784a3326952759e70d20539ac45b3fb2d85d3b55095ebe538feb15"}
Apr 20 12:16:11.670791 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:11.670747 2577 scope.go:117] "RemoveContainer" containerID="96298c2ee1784a3326952759e70d20539ac45b3fb2d85d3b55095ebe538feb15"
Apr 20 12:16:12.675527 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:12.675495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-hlmfm" event={"ID":"390328d7-a7ce-4e5b-bb2b-853e4f3b21d7","Type":"ContainerStarted","Data":"0b3b20b7d56ce9f4580b78dd32ee5e146bafe82d177f197ef5dface898fd53e2"}
Apr 20 12:16:18.122421 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:18.122383 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" podUID="c3717894-6e26-4912-a687-87e36b6785a8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 12:16:28.123173 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:28.123130 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" podUID="c3717894-6e26-4912-a687-87e36b6785a8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 12:16:38.122482 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.122443 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" podUID="c3717894-6e26-4912-a687-87e36b6785a8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 12:16:38.122942 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.122522 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw"
Apr 20 12:16:38.123029 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.122998 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"0b8375c1431e511163eeb7a0854734d4721d818858e16b23714f7dd9f2d4f1e0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 20 12:16:38.123066 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.123050 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" podUID="c3717894-6e26-4912-a687-87e36b6785a8" containerName="service-proxy" containerID="cri-o://0b8375c1431e511163eeb7a0854734d4721d818858e16b23714f7dd9f2d4f1e0" gracePeriod=30
Apr 20 12:16:38.749203 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.749169 2577 generic.go:358] "Generic (PLEG): container finished" podID="c3717894-6e26-4912-a687-87e36b6785a8" containerID="0b8375c1431e511163eeb7a0854734d4721d818858e16b23714f7dd9f2d4f1e0" exitCode=2
Apr 20 12:16:38.749373 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.749232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" event={"ID":"c3717894-6e26-4912-a687-87e36b6785a8","Type":"ContainerDied","Data":"0b8375c1431e511163eeb7a0854734d4721d818858e16b23714f7dd9f2d4f1e0"}
Apr 20 12:16:38.749373 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:16:38.749271 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86f59879c9-fpxkw" event={"ID":"c3717894-6e26-4912-a687-87e36b6785a8","Type":"ContainerStarted","Data":"56856e2d08bc0774d0b00b77fd2a442b552528d1cfaa735474fa97c1ec867ef0"}
Apr 20 12:18:54.958080 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:18:54.958055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:18:54.963620 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:18:54.963598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:18:54.969765 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:18:54.969746 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 12:23:54.981115 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:23:54.981090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:23:54.982943 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:23:54.982924 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:28:55.001531 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:28:55.001450 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:28:55.002147 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:28:55.001524 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:33:55.033594 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:33:55.033561 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:33:55.034156 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:33:55.033600 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:38:55.052418 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:38:55.052381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:38:55.053247 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:38:55.053229 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:43:55.073390 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:43:55.073364 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:43:55.075298 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:43:55.075276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:48:55.092333 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:48:55.092300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:48:55.094980 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:48:55.094959 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:53:55.111489 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:53:55.111464 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:53:55.115711 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:53:55.115685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:58:55.133879 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:58:55.133752 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:58:55.137612 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:58:55.137227 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:59:25.100454 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.100420 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-chpds/must-gather-2gn9n"]
Apr 20 12:59:25.100893 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.100687 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6d046ee-c2af-433b-9120-a41c0d53be7b" containerName="registry"
Apr 20 12:59:25.100893 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.100697 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6d046ee-c2af-433b-9120-a41c0d53be7b" containerName="registry"
Apr 20 12:59:25.100893 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.100754 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6d046ee-c2af-433b-9120-a41c0d53be7b" containerName="registry"
Apr 20 12:59:25.105661 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.105055 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.108163 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.108137 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chpds\"/\"kube-root-ca.crt\""
Apr 20 12:59:25.108163 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.108150 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-chpds\"/\"openshift-service-ca.crt\""
Apr 20 12:59:25.108354 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.108269 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-chpds\"/\"default-dockercfg-pv8sl\""
Apr 20 12:59:25.110851 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.110830 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/must-gather-2gn9n"]
Apr 20 12:59:25.128898 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.128874 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdjkm\" (UniqueName: \"kubernetes.io/projected/938e6660-c3ce-4b4f-9ba7-06f22d9104a8-kube-api-access-tdjkm\") pod \"must-gather-2gn9n\" (UID: \"938e6660-c3ce-4b4f-9ba7-06f22d9104a8\") " pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.128989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.128917 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/938e6660-c3ce-4b4f-9ba7-06f22d9104a8-must-gather-output\") pod \"must-gather-2gn9n\" (UID: \"938e6660-c3ce-4b4f-9ba7-06f22d9104a8\") " pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.229429 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.229398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdjkm\" (UniqueName: \"kubernetes.io/projected/938e6660-c3ce-4b4f-9ba7-06f22d9104a8-kube-api-access-tdjkm\") pod \"must-gather-2gn9n\" (UID: \"938e6660-c3ce-4b4f-9ba7-06f22d9104a8\") " pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.229581 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.229449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/938e6660-c3ce-4b4f-9ba7-06f22d9104a8-must-gather-output\") pod \"must-gather-2gn9n\" (UID: \"938e6660-c3ce-4b4f-9ba7-06f22d9104a8\") " pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.229759 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.229744 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/938e6660-c3ce-4b4f-9ba7-06f22d9104a8-must-gather-output\") pod \"must-gather-2gn9n\" (UID: \"938e6660-c3ce-4b4f-9ba7-06f22d9104a8\") " pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.238238 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.238216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdjkm\" (UniqueName: \"kubernetes.io/projected/938e6660-c3ce-4b4f-9ba7-06f22d9104a8-kube-api-access-tdjkm\") pod \"must-gather-2gn9n\" (UID: \"938e6660-c3ce-4b4f-9ba7-06f22d9104a8\") " pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.415353 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.415327 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/must-gather-2gn9n"
Apr 20 12:59:25.539653 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.539620 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/must-gather-2gn9n"]
Apr 20 12:59:25.542191 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:59:25.542164 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod938e6660_c3ce_4b4f_9ba7_06f22d9104a8.slice/crio-6a820c5fb64f997de5bffe5abb75b5941f10f54590f7868780983015a9609ee1 WatchSource:0}: Error finding container 6a820c5fb64f997de5bffe5abb75b5941f10f54590f7868780983015a9609ee1: Status 404 returned error can't find the container with id 6a820c5fb64f997de5bffe5abb75b5941f10f54590f7868780983015a9609ee1
Apr 20 12:59:25.543784 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.543767 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 12:59:25.853469 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:25.853383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/must-gather-2gn9n" event={"ID":"938e6660-c3ce-4b4f-9ba7-06f22d9104a8","Type":"ContainerStarted","Data":"6a820c5fb64f997de5bffe5abb75b5941f10f54590f7868780983015a9609ee1"}
Apr 20 12:59:26.858212 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:26.858182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/must-gather-2gn9n" event={"ID":"938e6660-c3ce-4b4f-9ba7-06f22d9104a8","Type":"ContainerStarted","Data":"cef40aa0d30306539bddfff619553cf6b0b15030d3e69a8c99877c41779ac67a"}
Apr 20 12:59:26.858212 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:26.858218 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/must-gather-2gn9n" event={"ID":"938e6660-c3ce-4b4f-9ba7-06f22d9104a8","Type":"ContainerStarted","Data":"f5214f700971598be733f51a881e49a892d10a172c30d8deaae968419c70ad58"}
Apr 20 12:59:26.875799 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:26.875754 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-chpds/must-gather-2gn9n" podStartSLOduration=0.97771538 podStartE2EDuration="1.875740601s" podCreationTimestamp="2026-04-20 12:59:25 +0000 UTC" firstStartedPulling="2026-04-20 12:59:25.543892367 +0000 UTC m=+2731.096116532" lastFinishedPulling="2026-04-20 12:59:26.441917588 +0000 UTC m=+2731.994141753" observedRunningTime="2026-04-20 12:59:26.873250803 +0000 UTC m=+2732.425474992" watchObservedRunningTime="2026-04-20 12:59:26.875740601 +0000 UTC m=+2732.427964789"
Apr 20 12:59:27.830240 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:27.830198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rkbqg_51f2c5d6-8d34-4caf-b764-5fd970fa149b/global-pull-secret-syncer/0.log"
Apr 20 12:59:27.946211 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:27.946185 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vwq8z_fbe4372b-f6e0-4562-b969-fe5fdeed773a/konnectivity-agent/0.log"
Apr 20 12:59:27.965232 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:27.965200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-55.ec2.internal_ec74c72a1ae3da2b3b1eef59bb72e15d/haproxy/0.log"
Apr 20 12:59:31.286788 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:31.286741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-cg6bv_fa839e80-dc90-4cc7-9ee9-2520a9717383/cluster-monitoring-operator/0.log"
Apr 20 12:59:31.591310 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:31.591283 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6lq9d_7ad7aa28-ceb4-4cd7-9451-71af9edfd101/node-exporter/0.log"
Apr 20 12:59:31.609882 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:31.609856 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6lq9d_7ad7aa28-ceb4-4cd7-9451-71af9edfd101/kube-rbac-proxy/0.log"
Apr 20 12:59:31.631160 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:31.631131 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6lq9d_7ad7aa28-ceb4-4cd7-9451-71af9edfd101/init-textfile/0.log"
Apr 20 12:59:33.199503 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:33.199472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-t44j5_c3302a13-45f4-425e-b2d2-ff221a9e7b91/networking-console-plugin/0.log"
Apr 20 12:59:33.604139 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:33.604046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/1.log"
Apr 20 12:59:33.610485 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:33.610458 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tsf4n_f7c3b32b-9b04-43fd-b10e-f895344efb6a/console-operator/2.log"
Apr 20 12:59:34.367221 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.367191 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-vb8dx_320a9c81-9dfd-4ada-9ef3-fab78e5c337d/volume-data-source-validator/0.log"
Apr 20 12:59:34.376097 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.376071 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"]
Apr 20 12:59:34.380989 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.380967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"
Apr 20 12:59:34.386201 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.386178 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"]
Apr 20 12:59:34.514430 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.514396 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-lib-modules\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"
Apr 20 12:59:34.514606 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.514450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-podres\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"
Apr 20 12:59:34.514606 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.514532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-proc\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"
Apr 20 12:59:34.514606 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.514582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-sys\") pod \"perf-node-gather-daemonset-qg2w7\" (UID:
\"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.514776 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.514605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmnz\" (UniqueName: \"kubernetes.io/projected/b981654c-4a68-4502-bfdf-68802a163bd4-kube-api-access-vhmnz\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.615958 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.615926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-proc\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616107 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.615972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-sys\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616107 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.616033 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-sys\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616107 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.616051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-proc\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616209 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.616098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmnz\" (UniqueName: \"kubernetes.io/projected/b981654c-4a68-4502-bfdf-68802a163bd4-kube-api-access-vhmnz\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616209 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.616148 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-lib-modules\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616279 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.616214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-podres\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616331 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.616318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-lib-modules\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.616404 ip-10-0-131-55 kubenswrapper[2577]: 
I0420 12:59:34.616367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b981654c-4a68-4502-bfdf-68802a163bd4-podres\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.625393 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.625325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmnz\" (UniqueName: \"kubernetes.io/projected/b981654c-4a68-4502-bfdf-68802a163bd4-kube-api-access-vhmnz\") pod \"perf-node-gather-daemonset-qg2w7\" (UID: \"b981654c-4a68-4502-bfdf-68802a163bd4\") " pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.692426 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.692383 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:34.830117 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.829998 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7"] Apr 20 12:59:34.832943 ip-10-0-131-55 kubenswrapper[2577]: W0420 12:59:34.832919 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb981654c_4a68_4502_bfdf_68802a163bd4.slice/crio-10e555d44f7ba627da5b60e70eedb290d9890b7d78f1c3959f797f85590a084c WatchSource:0}: Error finding container 10e555d44f7ba627da5b60e70eedb290d9890b7d78f1c3959f797f85590a084c: Status 404 returned error can't find the container with id 10e555d44f7ba627da5b60e70eedb290d9890b7d78f1c3959f797f85590a084c Apr 20 12:59:34.888065 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.887299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" 
event={"ID":"b981654c-4a68-4502-bfdf-68802a163bd4","Type":"ContainerStarted","Data":"10e555d44f7ba627da5b60e70eedb290d9890b7d78f1c3959f797f85590a084c"} Apr 20 12:59:34.975473 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.975447 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hxqfn_be5daddb-3077-4d22-8e15-d75f45ef9c2a/dns/0.log" Apr 20 12:59:34.994853 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:34.994830 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hxqfn_be5daddb-3077-4d22-8e15-d75f45ef9c2a/kube-rbac-proxy/0.log" Apr 20 12:59:35.130771 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:35.130749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bj5lp_24b836ac-13ec-49aa-be4b-4250c8e79676/dns-node-resolver/0.log" Apr 20 12:59:35.582947 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:35.582925 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zdf9s_a182a959-9bf8-48fe-b024-32a9f697eb23/node-ca/0.log" Apr 20 12:59:35.891907 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:35.891879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" event={"ID":"b981654c-4a68-4502-bfdf-68802a163bd4","Type":"ContainerStarted","Data":"7a3b9dd3a31a94567bd842910852e05caaa304a44ba668dac65e9e3843ed7a73"} Apr 20 12:59:35.892068 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:35.892023 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:35.908076 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:35.908029 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" podStartSLOduration=1.908012072 podStartE2EDuration="1.908012072s" podCreationTimestamp="2026-04-20 
12:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 12:59:35.906409433 +0000 UTC m=+2741.458633634" watchObservedRunningTime="2026-04-20 12:59:35.908012072 +0000 UTC m=+2741.460236260" Apr 20 12:59:36.208431 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.208355 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-964dd4574-ch7vj_057fb086-1c37-429a-8d5b-48a1306a3deb/router/0.log" Apr 20 12:59:36.545023 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.544957 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nnqtq_37056e79-d3b3-4b8c-954f-232d91e2a9a6/serve-healthcheck-canary/0.log" Apr 20 12:59:36.888290 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.888265 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hlmfm_390328d7-a7ce-4e5b-bb2b-853e4f3b21d7/insights-operator/0.log" Apr 20 12:59:36.889389 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.889369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-hlmfm_390328d7-a7ce-4e5b-bb2b-853e4f3b21d7/insights-operator/1.log" Apr 20 12:59:36.909324 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.909299 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-54t5x_0f6be8bd-925b-4342-9fc0-d43291f25a6c/kube-rbac-proxy/0.log" Apr 20 12:59:36.928548 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.928528 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-54t5x_0f6be8bd-925b-4342-9fc0-d43291f25a6c/exporter/0.log" Apr 20 12:59:36.947776 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:36.947754 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-54t5x_0f6be8bd-925b-4342-9fc0-d43291f25a6c/extractor/0.log" Apr 20 12:59:41.904732 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:41.904703 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-chpds/perf-node-gather-daemonset-qg2w7" Apr 20 12:59:42.140424 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:42.140390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gnj2p_df9d28bf-8c80-47b4-8d39-a66df9464d5b/kube-storage-version-migrator-operator/1.log" Apr 20 12:59:42.141796 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:42.141764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gnj2p_df9d28bf-8c80-47b4-8d39-a66df9464d5b/kube-storage-version-migrator-operator/0.log" Apr 20 12:59:43.089823 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.089731 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/kube-multus-additional-cni-plugins/0.log" Apr 20 12:59:43.111298 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.111268 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/egress-router-binary-copy/0.log" Apr 20 12:59:43.131381 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.131352 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/cni-plugins/0.log" Apr 20 12:59:43.153682 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.153657 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/bond-cni-plugin/0.log" Apr 20 12:59:43.177463 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.177438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/routeoverride-cni/0.log" Apr 20 12:59:43.201689 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.201662 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/whereabouts-cni-bincopy/0.log" Apr 20 12:59:43.221802 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.221776 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4m89_272f753b-f685-4425-8290-d42ee3ab9738/whereabouts-cni/0.log" Apr 20 12:59:43.613340 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.613311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dfjcx_f58cdd5e-92df-4b3b-b634-065a2b1275f5/kube-multus/0.log" Apr 20 12:59:43.731333 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.731256 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jnnsm_016f5832-4461-44e1-b03e-5ca0dc88515d/network-metrics-daemon/0.log" Apr 20 12:59:43.753491 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:43.753431 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jnnsm_016f5832-4461-44e1-b03e-5ca0dc88515d/kube-rbac-proxy/0.log" Apr 20 12:59:45.009244 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.009200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/ovn-controller/0.log" Apr 20 12:59:45.043683 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.043590 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/ovn-acl-logging/0.log" Apr 20 12:59:45.064269 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.064243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/kube-rbac-proxy-node/0.log" Apr 20 12:59:45.084662 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.084609 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 12:59:45.106697 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.106668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/northd/0.log" Apr 20 12:59:45.128354 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.128315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/nbdb/0.log" Apr 20 12:59:45.151422 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.151398 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/sbdb/0.log" Apr 20 12:59:45.265675 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:45.265625 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zmhl5_2d527830-9151-40c6-884f-3c8497f96667/ovnkube-controller/0.log" Apr 20 12:59:46.189060 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:46.189028 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-48fqr_1d2db534-7694-4bb0-bf28-bfdd20993c08/check-endpoints/0.log" Apr 20 12:59:46.251274 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:46.251243 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gbv8h_93e3b405-9e2d-44f9-8fc2-b7a191baecfe/network-check-target-container/0.log" Apr 20 12:59:47.182271 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:47.182245 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-j2n82_d752e3f9-624e-42e0-8b33-7285148161c0/iptables-alerter/0.log" Apr 20 12:59:47.862962 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:47.862931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hcg2g_72c6c42a-45e7-4a4c-8577-2984a8123380/tuned/0.log" Apr 20 12:59:49.415443 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:49.415407 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-sb889_f923f5f1-db46-4d28-810d-3ed65437dba9/cluster-samples-operator/0.log" Apr 20 12:59:49.431826 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:49.431798 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-sb889_f923f5f1-db46-4d28-810d-3ed65437dba9/cluster-samples-operator-watch/0.log" Apr 20 12:59:50.279553 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:50.279510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-x6nvq_b845784c-6d33-44f1-8015-6ea907093662/service-ca-operator/1.log" Apr 20 12:59:50.280762 ip-10-0-131-55 kubenswrapper[2577]: I0420 12:59:50.280737 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-x6nvq_b845784c-6d33-44f1-8015-6ea907093662/service-ca-operator/0.log"