Apr 21 07:08:35.207826 ip-10-0-131-184 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 07:08:35.207957 ip-10-0-131-184 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 07:08:35.208049 ip-10-0-131-184 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 07:08:35.208446 ip-10-0-131-184 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 07:08:45.323283 ip-10-0-131-184 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 07:08:45.323298 ip-10-0-131-184 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 683a8565bd854c98a7799147542c6d66 --
Apr 21 07:10:49.908794 ip-10-0-131-184 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:10:50.374631 ip-10-0-131-184 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:10:50.374631 ip-10-0-131-184 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:10:50.374631 ip-10-0-131-184 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:10:50.374631 ip-10-0-131-184 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:10:50.374631 ip-10-0-131-184 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:10:50.376354 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.376210    2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:10:50.378680 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378664    2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:10:50.378680 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378679    2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378683    2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378686    2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378689    2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378694    2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378697    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378700    2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378703    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378706    2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378709    2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378716    2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378720    2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378722    2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378725    2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378728    2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378731    2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378738    2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378741 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378744 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:10:50.378747 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378747 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378749 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378752 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378755 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378758 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378761 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378763 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378766 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378768 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378770 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378773 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378775 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378778 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378781 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378783 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378785 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378788 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378790 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378793 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:10:50.379186 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378795 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 
07:10:50.378798 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378801 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378803 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378806 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378808 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378810 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378813 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378815 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378818 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378820 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378822 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378825 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378827 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378830 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378833 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378836 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378838 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378841 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378843 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:10:50.379650 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378845 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378848 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378850 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378853 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 
07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378855 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378857 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378860 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378862 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378865 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378868 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378871 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378874 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378876 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378879 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378881 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378884 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378886 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378889 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378891 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:10:50.380122 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378893 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378896 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378898 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378901 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378903 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378905 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378908 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 
07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.378910 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379289 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379294 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379297 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379300 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379302 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379305 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379308 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379311 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379313 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379316 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379319 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379321 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:10:50.380578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379325 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379328 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379330 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379333 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379335 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379338 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379341 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379345 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379349 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379351 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379354 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379357 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379360 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379363 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379365 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379368 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379370 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379372 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379375 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:10:50.381047 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379377 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379381 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379383 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379386 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379388 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379392 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379395 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379398 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379400 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379403 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379405 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379407 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379410 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379412 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379415 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379418 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379420 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379423 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379426 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379428 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:10:50.381504 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379431 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379433 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379435 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379438 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379440 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379443 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379445 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379447 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379450 2567 
feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379452 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379455 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379457 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379459 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379462 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379465 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379467 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379470 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379472 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379476 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379478 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:10:50.381986 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379481 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379483 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379486 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379488 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379491 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379493 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379496 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379499 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379501 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379504 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379506 2567 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379509 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379511 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379514 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.379516 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381158 2567 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381172 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381181 2567 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381185 2567 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381189 2567 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381193 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 07:10:50.382486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381198 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381202 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381206 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381209 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381213 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381216 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381219 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381222 2567 flags.go:64] FLAG: --cgroup-root="" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381225 2567 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381228 2567 flags.go:64] FLAG: --client-ca-file="" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381231 2567 flags.go:64] FLAG: --cloud-config="" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381234 2567 flags.go:64] FLAG: --cloud-provider="external" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381237 2567 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381241 2567 flags.go:64] FLAG: --cluster-domain="" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381244 2567 flags.go:64] FLAG: 
--config="/etc/kubernetes/kubelet.conf" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381247 2567 flags.go:64] FLAG: --config-dir="" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381249 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381253 2567 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381257 2567 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381260 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381264 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381268 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381271 2567 flags.go:64] FLAG: --contention-profiling="false" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381274 2567 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 07:10:50.382977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381277 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381280 2567 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381283 2567 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381287 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381291 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381294 2567 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381296 2567 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381299 2567 flags.go:64] FLAG: --enable-server="true" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381302 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381306 2567 flags.go:64] FLAG: --event-burst="100" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381310 2567 flags.go:64] FLAG: --event-qps="50" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381313 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381316 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381319 2567 flags.go:64] FLAG: --eviction-hard="" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381323 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381326 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 07:10:50.383544 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:50.381329 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381332 2567 flags.go:64] FLAG: --eviction-soft="" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381335 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381338 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381341 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381344 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381347 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381350 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381352 2567 flags.go:64] FLAG: --feature-gates="" Apr 21 07:10:50.383544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381356 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381359 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381362 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381366 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381369 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381373 2567 flags.go:64] FLAG: --help="false" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381376 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381379 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381381 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381384 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381387 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381392 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381394 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381397 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381400 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381403 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:10:50.384160 
ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381405 2567 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381408 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381411 2567 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381414 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381417 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381420 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381423 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381426 2567 flags.go:64] FLAG: --lock-file="" Apr 21 07:10:50.384160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381428 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381431 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381434 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381439 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381442 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381445 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381448 2567 flags.go:64] FLAG: --logging-format="text" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381451 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381454 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381457 2567 flags.go:64] FLAG: --manifest-url="" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381460 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381464 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381467 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381472 2567 flags.go:64] FLAG: --max-pods="110" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381475 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381478 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381480 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381483 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" 
Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381486 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381489 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381492 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381500 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381503 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381506 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:10:50.384743 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381509 2567 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381512 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381518 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381533 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381536 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381539 2567 flags.go:64] FLAG: --port="10250" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381542 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381545 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06fdaf560e13efd3e" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381548 2567 flags.go:64] FLAG: --qos-reserved="" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381551 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381554 2567 flags.go:64] FLAG: --register-node="true" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381557 2567 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381560 2567 flags.go:64] FLAG: --register-with-taints="" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381563 2567 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381566 2567 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381569 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381572 2567 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381575 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381579 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381581 2567 
flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381585 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381588 2567 flags.go:64] FLAG: --runonce="false" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381591 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381595 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381597 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:10:50.385337 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381600 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381603 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381606 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381609 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381612 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381615 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381619 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381622 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381624 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381627 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381630 2567 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381633 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381638 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381641 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381644 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381648 2567 flags.go:64] FLAG: --tls-min-version="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381651 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381654 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381657 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381660 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:10:50.385971 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:50.381663 2567 flags.go:64] FLAG: --v="2" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381667 2567 flags.go:64] FLAG: --version="false" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381671 2567 flags.go:64] FLAG: --vmodule="" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381675 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.381679 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:10:50.385971 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381767 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381770 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381773 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381776 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381785 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381788 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381790 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381793 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381796 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381798 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381801 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381804 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381806 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381809 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381812 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381814 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381817 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381819 2567 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381822 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:10:50.386657 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381825 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381827 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381830 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381832 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381835 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381837 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381839 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381842 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381845 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381847 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381849 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381854 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381857 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381860 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381863 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381866 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381868 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381871 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381878 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381881 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:10:50.387123 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381884 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381886 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381889 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381894 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381896 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381899 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381901 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381905 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381914 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381916 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381919 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381922 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381924 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381927 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381929 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381931 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381934 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381936 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381939 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381941 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:10:50.387642 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381944 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381947 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381949 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381952 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381954 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381956 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381959 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381961 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381964 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381967 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 
07:10:50.381969 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381977 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381980 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381983 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381985 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381989 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381992 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381994 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381997 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.381999 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:10:50.388148 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382002 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382005 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382007 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382010 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382012 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382015 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.382017 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:10:50.388652 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.382792 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:10:50.390992 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.390974 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 07:10:50.390992 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.390992 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 07:10:50.391065 ip-10-0-131-184 
kubenswrapper[2567]: W0421 07:10:50.391059 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391066 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391070 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391073 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391076 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391079 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391081 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391084 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391086 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391089 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391092 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391094 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391096 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:10:50.391096 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391101 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391104 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391107 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391110 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391112 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391115 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391117 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391120 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391122 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391125 2567 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391127 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391129 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391132 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391135 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391137 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391139 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391142 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391144 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391146 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391149 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:10:50.391416 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391151 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391154 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391156 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391159 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391161 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391177 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391180 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391183 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391186 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391190 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
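The repeated `feature_gate.go:328] unrecognized feature gate: ...` warnings come from the kubelet's feature-gate parser being handed OpenShift-level gate names it never registered, and the same gate map is re-parsed several times during startup (at 07:10:50.378, .381 and .391 in this boot), so the full list of warnings repeats. Below is a minimal sketch, under a toy gate set, of how the upstream k8s.io/component-base/featuregate API reacts to an unknown name; the stock library returns an error naming the unknown gate, while the wrapper logging here appears to downgrade that to a warning. The gate names are taken from the log only as examples.

```go
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	// Register the gates this binary actually knows about (toy set for illustration).
	gates := featuregate.NewFeatureGate()
	_ = gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"ImageVolume":                   {Default: true, PreRelease: featuregate.Beta},
		"ServiceAccountTokenNodeBinding": {Default: true, LockToDefault: true, PreRelease: featuregate.GA},
	})

	// Feed it a map that also contains a name it has never heard of
	// (here one of the OpenShift-only gates seen in the log above).
	err := gates.SetFromMap(map[string]bool{
		"ImageVolume":                    true,
		"ServiceAccountTokenNodeBinding": true, // GA gate: upstream warns it will be removed
		"InsightsConfig":                 true, // unknown to this gate set
	})
	fmt.Println("SetFromMap:", err) // upstream: error "unrecognized feature gate: InsightsConfig"

	fmt.Println("ImageVolume enabled:", gates.Enabled("ImageVolume"))
}
```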
Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391194 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391198 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391200 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391204 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391208 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391211 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391213 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391216 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391218 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391221 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:10:50.391902 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391223 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391226 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391228 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391231 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391233 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391236 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391239 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391241 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391244 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391246 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391249 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391259 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:10:50.392381 ip-10-0-131-184 
kubenswrapper[2567]: W0421 07:10:50.391262 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391266 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391269 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391271 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391274 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391276 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391278 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391281 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:10:50.392381 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391284 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391286 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391289 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391291 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391294 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391297 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391301 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391304 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391307 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391309 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391312 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391315 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391317 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.391323 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391412 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:10:50.392872 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391416 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391419 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391423 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391425 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391428 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391431 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391433 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391436 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391438 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391441 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391443 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391446 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391448 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391451 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391454 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391456 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391459 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391462 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391464 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: 
W0421 07:10:50.391466 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:10:50.393232 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391469 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391472 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391474 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391477 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391481 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391485 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391488 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391490 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391493 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391495 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391498 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391500 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391503 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391505 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391508 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391511 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391513 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391515 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391518 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:10:50.393724 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391537 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391540 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:10:50.394173 ip-10-0-131-184 
kubenswrapper[2567]: W0421 07:10:50.391543 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391545 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391548 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391550 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391553 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391555 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391558 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391561 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391563 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391566 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391568 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391571 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391573 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391576 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391579 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391581 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391585 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391587 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:10:50.394173 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391590 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391593 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391595 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391598 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391600 2567 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391603 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391605 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391607 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391610 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391613 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391615 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391617 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391620 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391624 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391627 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391630 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391633 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391636 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391639 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391641 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:10:50.394788 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391644 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391646 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391649 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391652 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391654 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:50.391657 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.391662 2567 feature_gate.go:384] 
feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:10:50.395262 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.392445 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 07:10:50.395749 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.395736 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 07:10:50.396752 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.396741 2567 server.go:1019] "Starting client certificate rotation" Apr 21 07:10:50.396854 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.396838 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 07:10:50.396887 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.396878 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 07:10:50.426727 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.426710 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 07:10:50.432624 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.432602 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 07:10:50.448120 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.448101 2567 log.go:25] "Validated CRI v1 runtime API" Apr 21 07:10:50.454997 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.454981 2567 log.go:25] "Validated CRI v1 image API" Apr 21 07:10:50.457812 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.457792 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 07:10:50.460283 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.460262 2567 fs.go:135] Filesystem UUIDs: map[0a4c57ef-6f5a-43e0-9ae7-38284c4d5c39:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b8c72c1b-5ec1-414a-92a3-9f22a2e9dd87:/dev/nvme0n1p3] Apr 21 07:10:50.460372 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.460283 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 07:10:50.467907 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.467789 2567 manager.go:217] Machine: {Timestamp:2026-04-21 07:10:50.463950962 +0000 UTC m=+0.432327191 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3193617 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21f1a1d5022747b6189dc2ef10babd 
SystemUUID:ec21f1a1-d502-2747-b618-9dc2ef10babd BootID:683a8565-bd85-4c98-a779-9147542c6d66 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:98:4c:1a:38:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:98:4c:1a:38:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:7e:54:9a:d3:82 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 07:10:50.467907 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.467896 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
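Just before the machine inventory above, the kubelet probed the container runtime over the CRI socket ("Validated CRI v1 runtime API", "Validated CRI v1 image API") and took its cgroup driver from the runtime's answer (cgroupDriver="systemd"). The sketch below shows roughly what that version handshake looks like from a standalone CRI client, assuming CRI-O's default socket path /var/run/crio/crio.sock; it illustrates the CRI call, not the kubelet's own code path.

```go
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Assumed socket path; the kubelet takes this from --container-runtime-endpoint
	// (or the equivalent field in its config file).
	conn, err := grpc.NewClient("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ver, err := rt.Version(ctx, &runtimeapi.VersionRequest{})
	if err != nil {
		panic(err)
	}
	// For the node in this log this would report cri-o 1.33.10 speaking CRI v1.
	fmt.Printf("%s %s (CRI %s)\n", ver.RuntimeName, ver.RuntimeVersion, ver.RuntimeApiVersion)
}
```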
Apr 21 07:10:50.468060 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.468003 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 07:10:50.471644 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.471616 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 07:10:50.471816 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.471647 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-184.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 07:10:50.471897 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.471830 2567 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 07:10:50.471897 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.471841 2567 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 07:10:50.471897 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.471860 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:10:50.472144 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.472127 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 07:10:50.472698 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.472686 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:10:50.473861 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.473850 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:10:50.473982 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.473971 2567 server.go:1267] "Using root directory" 
path="/var/lib/kubelet" Apr 21 07:10:50.476355 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.476343 2567 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:10:50.476409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.476382 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:10:50.476409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.476399 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:10:50.476506 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.476414 2567 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:10:50.476506 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.476428 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 07:10:50.477580 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.477567 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:10:50.477649 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.477588 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:10:50.481819 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.481775 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:10:50.483485 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.483471 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:10:50.485553 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485540 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485558 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485564 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485570 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485575 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485581 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485587 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485592 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485599 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485604 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:10:50.485616 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485618 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:10:50.485851 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.485627 2567 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/local-volume" Apr 21 07:10:50.486514 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.486503 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:10:50.486514 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.486514 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:10:50.490140 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.490126 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:10:50.490203 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.490160 2567 server.go:1295] "Started kubelet" Apr 21 07:10:50.490301 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.490252 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:10:50.490338 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.490327 2567 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:10:50.490632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.490247 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:10:50.491035 ip-10-0-131-184 systemd[1]: Started Kubernetes Kubelet. Apr 21 07:10:50.491643 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.491620 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:10:50.493334 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.493320 2567 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:10:50.498322 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.498304 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-184.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 07:10:50.498405 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.498348 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 07:10:50.498405 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.498360 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 07:10:50.498632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.498614 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:10:50.498741 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.498711 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:10:50.499659 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.499560 2567 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 07:10:50.499659 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.499580 2567 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:10:50.499659 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.499578 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:50.499659 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:50.499649 2567 reconstruct.go:97] "Volume reconstruction finished" Apr 21 07:10:50.499659 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.499657 2567 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:10:50.499879 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.498627 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-184.ec2.internal.18a84dab27342e06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-184.ec2.internal,UID:ip-10-0-131-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-184.ec2.internal,},FirstTimestamp:2026-04-21 07:10:50.490138118 +0000 UTC m=+0.458514343,LastTimestamp:2026-04-21 07:10:50.490138118 +0000 UTC m=+0.458514343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-184.ec2.internal,}" Apr 21 07:10:50.500143 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500057 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:10:50.500143 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.500088 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 07:10:50.500479 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500465 2567 factory.go:55] Registering systemd factory Apr 21 07:10:50.500536 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500485 2567 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:10:50.500709 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500695 2567 factory.go:153] Registering CRI-O factory Apr 21 07:10:50.500709 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500711 2567 factory.go:223] Registration of the crio container factory successfully Apr 21 07:10:50.500839 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500758 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:10:50.500839 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500781 2567 factory.go:103] Registering Raw factory Apr 21 07:10:50.500839 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.500795 2567 manager.go:1196] Started watching for new ooms in manager Apr 21 07:10:50.501259 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.501244 2567 manager.go:319] Starting recovery of all containers Apr 21 07:10:50.502869 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.502842 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 07:10:50.503391 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.503332 2567 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"ip-10-0-131-184.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 07:10:50.507021 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.506865 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 07:10:50.510338 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.510319 2567 manager.go:324] Recovery completed Apr 21 07:10:50.515549 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.515534 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:10:50.517987 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.517972 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:10:50.518046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.517999 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:10:50.518046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.518009 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:10:50.518475 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.518460 2567 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:10:50.518475 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.518472 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 07:10:50.518581 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.518487 2567 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:10:50.521296 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.521284 2567 policy_none.go:49] "None policy: Start" Apr 21 07:10:50.521340 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.521300 2567 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:10:50.521340 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.521310 2567 state_mem.go:35] "Initializing new in-memory state store" Apr 21 07:10:50.538338 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.538260 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-184.ec2.internal.18a84dab28dd1d2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-184.ec2.internal,UID:ip-10-0-131-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-184.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-184.ec2.internal,},FirstTimestamp:2026-04-21 07:10:50.517986604 +0000 UTC m=+0.486362829,LastTimestamp:2026-04-21 07:10:50.517986604 +0000 UTC m=+0.486362829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-184.ec2.internal,}" Apr 21 07:10:50.557961 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.557896 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-131-184.ec2.internal.18a84dab28dd6087 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-184.ec2.internal,UID:ip-10-0-131-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-184.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-184.ec2.internal,},FirstTimestamp:2026-04-21 07:10:50.518003847 +0000 UTC m=+0.486380073,LastTimestamp:2026-04-21 07:10:50.518003847 +0000 UTC m=+0.486380073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-184.ec2.internal,}" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559062 2567 manager.go:341] "Starting Device Plugin manager" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.559100 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559113 2567 server.go:85] "Starting device plugin registration server" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559339 2567 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559350 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559439 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559536 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.559545 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.559999 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 07:10:50.566154 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.560044 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:50.567915 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.567851 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-184.ec2.internal.18a84dab28dd826b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-184.ec2.internal,UID:ip-10-0-131-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-184.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-184.ec2.internal,},FirstTimestamp:2026-04-21 07:10:50.518012523 +0000 UTC m=+0.486388749,LastTimestamp:2026-04-21 07:10:50.518012523 +0000 UTC m=+0.486388749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-184.ec2.internal,}" Apr 21 07:10:50.583478 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.583408 2567 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-184.ec2.internal.18a84dab2b876ac1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-184.ec2.internal,UID:ip-10-0-131-184.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-131-184.ec2.internal,},FirstTimestamp:2026-04-21 07:10:50.562702017 +0000 UTC m=+0.531078241,LastTimestamp:2026-04-21 07:10:50.562702017 +0000 UTC m=+0.531078241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-184.ec2.internal,}" Apr 21 07:10:50.622388 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.622369 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9zdpr" Apr 21 07:10:50.630390 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.630349 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 07:10:50.630390 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.630377 2567 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:10:50.630476 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.630394 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 07:10:50.630476 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.630403 2567 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:10:50.630476 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.630429 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:10:50.633482 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.633457 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9zdpr" Apr 21 07:10:50.660000 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.659983 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:10:50.660899 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.660884 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:10:50.660968 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.660917 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:10:50.660968 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.660933 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:10:50.660968 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.660960 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.668226 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.668212 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:10:50.681094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.681078 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.681137 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.681101 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-184.ec2.internal\": node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:50.730838 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.730818 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal"] Apr 21 07:10:50.730942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.730880 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:10:50.732475 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.732457 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:10:50.732587 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.732490 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:10:50.732587 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.732505 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:10:50.733844 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.733830 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:10:50.733996 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:50.733983 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.734042 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734010 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:10:50.734494 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734480 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:10:50.734586 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734507 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:10:50.734586 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734484 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:10:50.734586 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734558 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:10:50.734586 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734575 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:10:50.734757 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.734537 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:10:50.735640 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.735627 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.735689 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.735651 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:10:50.736275 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.736262 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:10:50.736338 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.736284 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:10:50.736338 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.736304 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:10:50.750024 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.750005 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-184.ec2.internal\" not found" node="ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.753650 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.753636 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-184.ec2.internal\" not found" node="ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.789123 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.789103 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:50.801612 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.801594 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bcfcf54b7f1b4b651ff33830189a3fdf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal\" (UID: \"bcfcf54b7f1b4b651ff33830189a3fdf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.801654 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.801624 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcfcf54b7f1b4b651ff33830189a3fdf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal\" (UID: \"bcfcf54b7f1b4b651ff33830189a3fdf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.801654 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.801641 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b2b2976bec5eff564002b454ef52b93-config\") pod \"kube-apiserver-proxy-ip-10-0-131-184.ec2.internal\" (UID: \"3b2b2976bec5eff564002b454ef52b93\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.890164 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.890087 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:50.902529 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.902503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/bcfcf54b7f1b4b651ff33830189a3fdf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal\" (UID: \"bcfcf54b7f1b4b651ff33830189a3fdf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.902645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.902565 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcfcf54b7f1b4b651ff33830189a3fdf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal\" (UID: \"bcfcf54b7f1b4b651ff33830189a3fdf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.902645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.902592 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bcfcf54b7f1b4b651ff33830189a3fdf-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal\" (UID: \"bcfcf54b7f1b4b651ff33830189a3fdf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.902645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.902622 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b2b2976bec5eff564002b454ef52b93-config\") pod \"kube-apiserver-proxy-ip-10-0-131-184.ec2.internal\" (UID: \"3b2b2976bec5eff564002b454ef52b93\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.902645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.902594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3b2b2976bec5eff564002b454ef52b93-config\") pod \"kube-apiserver-proxy-ip-10-0-131-184.ec2.internal\" (UID: \"3b2b2976bec5eff564002b454ef52b93\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.902775 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:50.902662 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcfcf54b7f1b4b651ff33830189a3fdf-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal\" (UID: \"bcfcf54b7f1b4b651ff33830189a3fdf\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:50.990208 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:50.990161 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.051867 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.051840 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:51.056432 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.056415 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" Apr 21 07:10:51.091032 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.091003 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.191658 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.191582 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.292118 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.292085 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.392766 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.392736 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.396921 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.396902 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 07:10:51.397046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.397031 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:10:51.459937 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.459881 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:10:51.489275 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.489254 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:10:51.493317 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.493298 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.496866 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:51.496826 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcfcf54b7f1b4b651ff33830189a3fdf.slice/crio-d3da1be4bcbc65cc79bc007c29c036a5f8b9d49c93b0e546e1b76adfbd7836e2 WatchSource:0}: Error finding container d3da1be4bcbc65cc79bc007c29c036a5f8b9d49c93b0e546e1b76adfbd7836e2: Status 404 returned error can't find the container with id d3da1be4bcbc65cc79bc007c29c036a5f8b9d49c93b0e546e1b76adfbd7836e2 Apr 21 07:10:51.497496 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:51.497477 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b2b2976bec5eff564002b454ef52b93.slice/crio-7a990a6774bb5fdb661e2c98488a74b0422f85edc4abe40c787318cafa85c8ec WatchSource:0}: Error finding container 7a990a6774bb5fdb661e2c98488a74b0422f85edc4abe40c787318cafa85c8ec: Status 404 returned error can't find the container with id 7a990a6774bb5fdb661e2c98488a74b0422f85edc4abe40c787318cafa85c8ec Apr 21 07:10:51.499641 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.499628 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 07:10:51.501888 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.501873 2567 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 21 07:10:51.521644 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.521620 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 07:10:51.593949 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.593918 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.605856 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.605835 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jf5mr" Apr 21 07:10:51.624498 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.624479 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jf5mr" Apr 21 07:10:51.633493 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.633456 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" event={"ID":"3b2b2976bec5eff564002b454ef52b93","Type":"ContainerStarted","Data":"7a990a6774bb5fdb661e2c98488a74b0422f85edc4abe40c787318cafa85c8ec"} Apr 21 07:10:51.634422 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.634403 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" event={"ID":"bcfcf54b7f1b4b651ff33830189a3fdf","Type":"ContainerStarted","Data":"d3da1be4bcbc65cc79bc007c29c036a5f8b9d49c93b0e546e1b76adfbd7836e2"} Apr 21 07:10:51.637568 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.637515 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 07:05:50 +0000 UTC" deadline="2027-10-25 23:31:33.085910772 +0000 UTC" Apr 21 07:10:51.637568 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.637568 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13264h20m41.4483453s" Apr 21 07:10:51.694679 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.694661 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.795274 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.795203 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.895945 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:51.895899 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-184.ec2.internal\" not found" Apr 21 07:10:51.900566 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.900543 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:10:51.999336 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:51.999313 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" Apr 21 07:10:52.027696 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.027674 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 
07:10:52.027839 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.027812 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" Apr 21 07:10:52.040333 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.040306 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:10:52.477598 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.477562 2567 apiserver.go:52] "Watching apiserver" Apr 21 07:10:52.485167 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.485140 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 07:10:52.485515 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.485493 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-v524f","openshift-cluster-node-tuning-operator/tuned-vxq2w","openshift-image-registry/node-ca-7vkh9","openshift-multus/multus-additional-cni-plugins-9t6vk","openshift-network-diagnostics/network-check-target-fsfmp","openshift-network-operator/iptables-alerter-5hf89","kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48","openshift-dns/node-resolver-qzpkj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal","openshift-multus/multus-bs4gw","openshift-multus/network-metrics-daemon-xxrlv","openshift-ovn-kubernetes/ovnkube-node-pbdvz"] Apr 21 07:10:52.487773 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.487753 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:52.487869 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.487835 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:10:52.488789 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.488770 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.489967 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.489948 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.491314 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.491294 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.492620 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.492602 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.494027 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.494008 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.495338 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.495320 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:52.495427 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.495401 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:10:52.496827 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.496812 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.498385 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.498368 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.498848 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.498610 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 07:10:52.498848 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.498803 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5zf2r\"" Apr 21 07:10:52.499068 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.499016 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 07:10:52.499123 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.499110 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.499247 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.499216 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sw4cm\"" Apr 21 07:10:52.499503 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.499485 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.499645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.499615 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.500410 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.500391 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.502330 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.502311 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.502741 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.502726 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.503129 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503107 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.503220 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503113 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.503220 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503150 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gwqpq\"" Apr 21 07:10:52.503785 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503463 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 07:10:52.503785 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503671 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 07:10:52.503785 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503704 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 07:10:52.503785 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503713 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.504012 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.503996 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.504079 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.504045 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.504079 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.504054 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gwksx\"" Apr 21 07:10:52.504197 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.504179 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.504280 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.504229 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 07:10:52.505406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.505217 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.505406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.505245 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 07:10:52.506371 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.506349 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 07:10:52.508307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508287 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qpdtf\"" Apr 21 07:10:52.508441 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508418 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 07:10:52.508441 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508437 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7v8lt\"" Apr 21 07:10:52.508758 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508426 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 07:10:52.508758 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508494 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 07:10:52.508758 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508502 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vhdzg\"" Apr 21 07:10:52.508918 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.508796 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 07:10:52.509396 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.509236 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-z6d2x\"" Apr 21 07:10:52.509396 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.509262 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 07:10:52.509396 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.509276 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4l8sd\"" Apr 21 07:10:52.509396 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.509341 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 07:10:52.509396 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.509342 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.509752 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.509602 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 07:10:52.510860 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.510835 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c395523b-6f94-447f-a14f-b3e86618c396-ovn-node-metrics-cert\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.510944 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.510872 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-cni-binary-copy\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.510944 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.510898 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.510944 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.510926 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-device-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.511086 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.510950 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4d6c\" (UniqueName: \"kubernetes.io/projected/c395523b-6f94-447f-a14f-b3e86618c396-kube-api-access-j4d6c\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511086 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.510972 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-serviceca\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.511086 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511000 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511086 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511051 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-cni-bin\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511093 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-etc-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511120 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-cni-netd\") pod 
\"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysctl-conf\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511171 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c0141af-1317-4665-bd56-7841a1731312-tmp\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511194 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r59bg\" (UniqueName: \"kubernetes.io/projected/6d1ac31b-8866-4817-8119-87e810a0da44-kube-api-access-r59bg\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511218 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-host\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.511253 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511238 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8352266b-7f87-4b49-9222-1a7518a8bda8-host-slash\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511260 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-registration-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511283 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-etc-selinux\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511321 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/64fd83cc-7ef6-4cb9-892b-0111cac9771d-tmp-dir\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.511582 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:52.511345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysctl-d\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511365 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-sys\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511387 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-var-lib-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-env-overrides\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511434 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-var-lib-kubelet\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511475 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml28q\" (UniqueName: \"kubernetes.io/projected/8352266b-7f87-4b49-9222-1a7518a8bda8-kube-api-access-ml28q\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511497 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-kubelet\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511540 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-node-log\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511563 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg2wx\" (UniqueName: \"kubernetes.io/projected/7c0141af-1317-4665-bd56-7841a1731312-kube-api-access-pg2wx\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.511582 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511584 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-system-cni-dir\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511608 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdw9\" (UniqueName: \"kubernetes.io/projected/b044312b-3805-4344-992b-7e7befb3d7f3-kube-api-access-kvdw9\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-run-netns\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511651 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/64fd83cc-7ef6-4cb9-892b-0111cac9771d-hosts-file\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-slash\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511720 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511743 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-modprobe-d\") pod 
\"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511765 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-kubernetes\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511795 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-run\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511820 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-systemd-units\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511841 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-log-socket\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhkjh\" (UniqueName: \"kubernetes.io/projected/ca549d86-e91c-4488-bfac-cf936e205050-kube-api-access-bhkjh\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-socket-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzdrp\" (UniqueName: \"kubernetes.io/projected/64fd83cc-7ef6-4cb9-892b-0111cac9771d-kube-api-access-gzdrp\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511936 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-systemd\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 
ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511956 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-ovn\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512093 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.511979 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-ovnkube-script-lib\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512017 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysconfig\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512046 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-cnibin\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512067 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-os-release\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512088 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512104 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512133 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-ovnkube-config\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512152 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/476caf85-7f49-4bd9-944d-7dd2e7975a87-agent-certs\") pod \"konnectivity-agent-v524f\" (UID: \"476caf85-7f49-4bd9-944d-7dd2e7975a87\") " pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512167 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512180 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zchz9\" (UniqueName: \"kubernetes.io/projected/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-kube-api-access-zchz9\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512219 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-sys-fs\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-host\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512282 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8352266b-7f87-4b49-9222-1a7518a8bda8-iptables-alerter-script\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-systemd\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512357 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7c0141af-1317-4665-bd56-7841a1731312-etc-tuned\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512382 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-lib-modules\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.512767 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:52.513593 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512430 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/476caf85-7f49-4bd9-944d-7dd2e7975a87-konnectivity-ca\") pod \"konnectivity-agent-v524f\" (UID: \"476caf85-7f49-4bd9-944d-7dd2e7975a87\") " pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.513593 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.512446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.600990 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.600964 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 07:10:52.612821 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/476caf85-7f49-4bd9-944d-7dd2e7975a87-agent-certs\") pod \"konnectivity-agent-v524f\" (UID: \"476caf85-7f49-4bd9-944d-7dd2e7975a87\") " pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.612821 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.612989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-netns\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.612989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612873 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-cni-bin\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.612989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612894 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-hostroot\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.612989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612932 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zchz9\" (UniqueName: \"kubernetes.io/projected/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-kube-api-access-zchz9\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.612989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612955 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-sys-fs\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.612989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.612978 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-host\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613005 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/918e28f2-6377-405c-885f-92621fe803a0-cni-binary-copy\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613015 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8352266b-7f87-4b49-9222-1a7518a8bda8-iptables-alerter-script\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613058 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-sys-fs\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613070 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-systemd\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613112 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-systemd\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613111 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-system-cni-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-host\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613189 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc2lk\" (UniqueName: \"kubernetes.io/projected/918e28f2-6377-405c-885f-92621fe803a0-kube-api-access-mc2lk\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613166 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 07:10:52.613231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7c0141af-1317-4665-bd56-7841a1731312-etc-tuned\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613240 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-lib-modules\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613265 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613310 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/476caf85-7f49-4bd9-944d-7dd2e7975a87-konnectivity-ca\") pod \"konnectivity-agent-v524f\" (UID: \"476caf85-7f49-4bd9-944d-7dd2e7975a87\") " pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613389 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-lib-modules\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.613400 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.613463 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:10:53.113443531 +0000 UTC m=+3.081819749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613502 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8352266b-7f87-4b49-9222-1a7518a8bda8-iptables-alerter-script\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613501 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c395523b-6f94-447f-a14f-b3e86618c396-ovn-node-metrics-cert\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-cni-binary-copy\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613653 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613682 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-device-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.613697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4d6c\" (UniqueName: \"kubernetes.io/projected/c395523b-6f94-447f-a14f-b3e86618c396-kube-api-access-j4d6c\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-conf-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613750 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/918e28f2-6377-405c-885f-92621fe803a0-multus-daemon-config\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613773 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-serviceca\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613793 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613817 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/476caf85-7f49-4bd9-944d-7dd2e7975a87-konnectivity-ca\") pod \"konnectivity-agent-v524f\" (UID: \"476caf85-7f49-4bd9-944d-7dd2e7975a87\") " pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613815 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-cni-bin\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.614324 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:52.613850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-cnibin\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-os-release\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613894 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-k8s-cni-cncf-io\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613915 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-cni-multus\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613939 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-etc-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613973 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-cni-netd\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.613994 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysctl-conf\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c0141af-1317-4665-bd56-7841a1731312-tmp\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614034 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r59bg\" (UniqueName: \"kubernetes.io/projected/6d1ac31b-8866-4817-8119-87e810a0da44-kube-api-access-r59bg\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " 
pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-host\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.614324 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614072 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8352266b-7f87-4b49-9222-1a7518a8bda8-host-slash\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-registration-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-etc-selinux\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614147 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/64fd83cc-7ef6-4cb9-892b-0111cac9771d-tmp-dir\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysctl-d\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614191 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-sys\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614192 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614216 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-var-lib-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614266 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-var-lib-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614287 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-serviceca\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614328 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysctl-conf\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614360 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-env-overrides\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614587 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-host\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614626 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8352266b-7f87-4b49-9222-1a7518a8bda8-host-slash\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-var-lib-kubelet\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614690 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-cni-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614703 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-device-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.615148 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614717 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-multus-certs\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614750 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614779 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml28q\" (UniqueName: \"kubernetes.io/projected/8352266b-7f87-4b49-9222-1a7518a8bda8-kube-api-access-ml28q\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-kubelet\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614847 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-node-log\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg2wx\" (UniqueName: \"kubernetes.io/projected/7c0141af-1317-4665-bd56-7841a1731312-kube-api-access-pg2wx\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614887 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-cni-binary-copy\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614898 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-system-cni-dir\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " 
pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614926 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-kubelet\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614752 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-etc-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.614957 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-registration-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615005 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-system-cni-dir\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-cni-netd\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615113 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-etc-selinux\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615166 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-env-overrides\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615230 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-var-lib-kubelet\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615231 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-sys\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.615912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdw9\" (UniqueName: \"kubernetes.io/projected/b044312b-3805-4344-992b-7e7befb3d7f3-kube-api-access-kvdw9\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615304 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-run-netns\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615332 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/64fd83cc-7ef6-4cb9-892b-0111cac9771d-hosts-file\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615330 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/64fd83cc-7ef6-4cb9-892b-0111cac9771d-tmp-dir\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615425 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-cni-bin\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615454 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/64fd83cc-7ef6-4cb9-892b-0111cac9771d-hosts-file\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-node-log\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-slash\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615512 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615560 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-modprobe-d\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615586 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-kubernetes\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615610 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-run\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615667 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-systemd-units\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615671 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-openvswitch\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-log-socket\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615714 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-run-netns\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615720 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhkjh\" (UniqueName: \"kubernetes.io/projected/ca549d86-e91c-4488-bfac-cf936e205050-kube-api-access-bhkjh\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615746 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-socket-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.616665 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615756 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-run\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615778 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-socket-dir-parent\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615807 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzdrp\" (UniqueName: \"kubernetes.io/projected/64fd83cc-7ef6-4cb9-892b-0111cac9771d-kube-api-access-gzdrp\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615835 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-systemd\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615927 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-ovn\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysctl-d\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615956 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-ovnkube-script-lib\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615967 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-run-ovn-kubernetes\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.617409 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:52.615983 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysconfig\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.615989 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-kubelet\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616014 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-log-socket\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616016 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-cnibin\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616037 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-sysconfig\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616043 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-cnibin\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616058 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-os-release\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616119 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: 
\"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.617409 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-socket-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-ovnkube-config\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca549d86-e91c-4488-bfac-cf936e205050-os-release\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616253 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-kubernetes\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616271 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b044312b-3805-4344-992b-7e7befb3d7f3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616291 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-ovn\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616344 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7c0141af-1317-4665-bd56-7841a1731312-etc-modprobe-d\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-etc-kubernetes\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616466 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-run-systemd\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616474 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-systemd-units\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c395523b-6f94-447f-a14f-b3e86618c396-host-slash\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616776 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-ovnkube-script-lib\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616925 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca549d86-e91c-4488-bfac-cf936e205050-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.616994 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c395523b-6f94-447f-a14f-b3e86618c396-ovnkube-config\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.617598 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7c0141af-1317-4665-bd56-7841a1731312-etc-tuned\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.617629 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c0141af-1317-4665-bd56-7841a1731312-tmp\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.618094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.617693 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/476caf85-7f49-4bd9-944d-7dd2e7975a87-agent-certs\") pod \"konnectivity-agent-v524f\" (UID: \"476caf85-7f49-4bd9-944d-7dd2e7975a87\") " pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.619117 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.619099 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c395523b-6f94-447f-a14f-b3e86618c396-ovn-node-metrics-cert\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.625680 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.625657 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:05:51 +0000 UTC" deadline="2028-01-22 20:28:26.05667165 +0000 UTC" Apr 21 07:10:52.625680 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.625676 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15397h17m33.430997419s" Apr 21 07:10:52.634805 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.634782 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:10:52.634805 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.634806 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:10:52.634951 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.634816 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:52.634951 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:52.634880 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:10:53.134860828 +0000 UTC m=+3.103237041 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:52.639169 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.639138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml28q\" (UniqueName: \"kubernetes.io/projected/8352266b-7f87-4b49-9222-1a7518a8bda8-kube-api-access-ml28q\") pod \"iptables-alerter-5hf89\" (UID: \"8352266b-7f87-4b49-9222-1a7518a8bda8\") " pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.649788 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.648852 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zchz9\" (UniqueName: \"kubernetes.io/projected/8105a8f5-e174-49e3-ba2e-c9e8b7d649a4-kube-api-access-zchz9\") pod \"node-ca-7vkh9\" (UID: \"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4\") " pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.649788 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.649092 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzdrp\" (UniqueName: \"kubernetes.io/projected/64fd83cc-7ef6-4cb9-892b-0111cac9771d-kube-api-access-gzdrp\") pod \"node-resolver-qzpkj\" (UID: \"64fd83cc-7ef6-4cb9-892b-0111cac9771d\") " pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.650802 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.650783 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg2wx\" (UniqueName: \"kubernetes.io/projected/7c0141af-1317-4665-bd56-7841a1731312-kube-api-access-pg2wx\") pod \"tuned-vxq2w\" (UID: \"7c0141af-1317-4665-bd56-7841a1731312\") " pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.653191 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.653170 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4d6c\" (UniqueName: \"kubernetes.io/projected/c395523b-6f94-447f-a14f-b3e86618c396-kube-api-access-j4d6c\") pod \"ovnkube-node-pbdvz\" (UID: \"c395523b-6f94-447f-a14f-b3e86618c396\") " pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.658614 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.658594 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhkjh\" (UniqueName: \"kubernetes.io/projected/ca549d86-e91c-4488-bfac-cf936e205050-kube-api-access-bhkjh\") pod \"multus-additional-cni-plugins-9t6vk\" (UID: \"ca549d86-e91c-4488-bfac-cf936e205050\") " pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.659820 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.659800 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r59bg\" (UniqueName: \"kubernetes.io/projected/6d1ac31b-8866-4817-8119-87e810a0da44-kube-api-access-r59bg\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:52.660814 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.660794 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdw9\" (UniqueName: 
\"kubernetes.io/projected/b044312b-3805-4344-992b-7e7befb3d7f3-kube-api-access-kvdw9\") pod \"aws-ebs-csi-driver-node-gnx48\" (UID: \"b044312b-3805-4344-992b-7e7befb3d7f3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.717058 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717020 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-etc-kubernetes\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717065 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-netns\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-cni-bin\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717108 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-hostroot\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717118 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-etc-kubernetes\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717119 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-netns\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717125 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/918e28f2-6377-405c-885f-92621fe803a0-cni-binary-copy\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717168 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-cni-bin\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717446 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717217 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-hostroot\") 
pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717446 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717262 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-system-cni-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717446 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc2lk\" (UniqueName: \"kubernetes.io/projected/918e28f2-6377-405c-885f-92621fe803a0-kube-api-access-mc2lk\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717446 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717304 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-system-cni-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717446 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717354 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-conf-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717446 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717398 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-conf-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/918e28f2-6377-405c-885f-92621fe803a0-multus-daemon-config\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-cnibin\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717514 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-os-release\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717551 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-k8s-cni-cncf-io\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " 
pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717584 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-cni-multus\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717608 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-cnibin\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717621 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-cni-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717647 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-multus-certs\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717662 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-cni-multus\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717669 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-os-release\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717688 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-kubelet\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717691 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/918e28f2-6377-405c-885f-92621fe803a0-cni-binary-copy\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717693 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-k8s-cni-cncf-io\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 
kubenswrapper[2567]: I0421 07:10:52.717717 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-cni-dir\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717719 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-run-multus-certs\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-socket-dir-parent\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.717746 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717755 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-host-var-lib-kubelet\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.718398 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.717807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/918e28f2-6377-405c-885f-92621fe803a0-multus-socket-dir-parent\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.718398 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.718003 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/918e28f2-6377-405c-885f-92621fe803a0-multus-daemon-config\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.729548 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.729479 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc2lk\" (UniqueName: \"kubernetes.io/projected/918e28f2-6377-405c-885f-92621fe803a0-kube-api-access-mc2lk\") pod \"multus-bs4gw\" (UID: \"918e28f2-6377-405c-885f-92621fe803a0\") " pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.799460 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.799432 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" Apr 21 07:10:52.808503 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.808477 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7vkh9" Apr 21 07:10:52.817401 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.817384 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" Apr 21 07:10:52.823999 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.823982 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5hf89" Apr 21 07:10:52.830921 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.830902 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v524f" Apr 21 07:10:52.836517 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.836496 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" Apr 21 07:10:52.843064 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.843037 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzpkj" Apr 21 07:10:52.849702 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.849681 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:10:52.856241 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.856220 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bs4gw" Apr 21 07:10:52.872393 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:52.872378 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:10:53.106885 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.106850 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb044312b_3805_4344_992b_7e7befb3d7f3.slice/crio-241d7492e6e3f7787de70ebf8186394a630f5379fb859292270804c2a84668e4 WatchSource:0}: Error finding container 241d7492e6e3f7787de70ebf8186394a630f5379fb859292270804c2a84668e4: Status 404 returned error can't find the container with id 241d7492e6e3f7787de70ebf8186394a630f5379fb859292270804c2a84668e4 Apr 21 07:10:53.107435 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.107409 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0141af_1317_4665_bd56_7841a1731312.slice/crio-ede82e10401e14768b2f4027033e6323380c89f3ab079bba10757290a72fd762 WatchSource:0}: Error finding container ede82e10401e14768b2f4027033e6323380c89f3ab079bba10757290a72fd762: Status 404 returned error can't find the container with id ede82e10401e14768b2f4027033e6323380c89f3ab079bba10757290a72fd762 Apr 21 07:10:53.110300 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.110180 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64fd83cc_7ef6_4cb9_892b_0111cac9771d.slice/crio-a0bd90ad7d635ecd841097647c3402de2865e01c8459c776b82cc4e44e497cbf WatchSource:0}: Error finding container a0bd90ad7d635ecd841097647c3402de2865e01c8459c776b82cc4e44e497cbf: Status 404 returned error can't find the container with id a0bd90ad7d635ecd841097647c3402de2865e01c8459c776b82cc4e44e497cbf Apr 21 07:10:53.112277 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.112257 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod918e28f2_6377_405c_885f_92621fe803a0.slice/crio-70ba956f9141fbd8c8fdc588fbbec3cafafd5bf1d1a95f6831de94e68f9c3208 WatchSource:0}: Error finding container 70ba956f9141fbd8c8fdc588fbbec3cafafd5bf1d1a95f6831de94e68f9c3208: Status 404 returned error can't find the container with id 70ba956f9141fbd8c8fdc588fbbec3cafafd5bf1d1a95f6831de94e68f9c3208 Apr 21 
07:10:53.114692 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.114668 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod476caf85_7f49_4bd9_944d_7dd2e7975a87.slice/crio-d86fddb49558985cf01dfd59f1bbe13ccc09c1124c3402140915b6c5b643a48c WatchSource:0}: Error finding container d86fddb49558985cf01dfd59f1bbe13ccc09c1124c3402140915b6c5b643a48c: Status 404 returned error can't find the container with id d86fddb49558985cf01dfd59f1bbe13ccc09c1124c3402140915b6c5b643a48c Apr 21 07:10:53.115397 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.115362 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc395523b_6f94_447f_a14f_b3e86618c396.slice/crio-0c45d2622b558f859b09bb89af20e4d0d6cb62c19ecc28220bbe876ebb5169fd WatchSource:0}: Error finding container 0c45d2622b558f859b09bb89af20e4d0d6cb62c19ecc28220bbe876ebb5169fd: Status 404 returned error can't find the container with id 0c45d2622b558f859b09bb89af20e4d0d6cb62c19ecc28220bbe876ebb5169fd Apr 21 07:10:53.116268 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.116248 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8352266b_7f87_4b49_9222_1a7518a8bda8.slice/crio-35e2e1f71c9aef28ae14d86279470311e6c7675f60c6e5ee751f0294ddc6930f WatchSource:0}: Error finding container 35e2e1f71c9aef28ae14d86279470311e6c7675f60c6e5ee751f0294ddc6930f: Status 404 returned error can't find the container with id 35e2e1f71c9aef28ae14d86279470311e6c7675f60c6e5ee751f0294ddc6930f Apr 21 07:10:53.117246 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:10:53.117228 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca549d86_e91c_4488_bfac_cf936e205050.slice/crio-04fc84647fea912af06d93c5136be00d6773641532eba501a706bf1ad53f5945 WatchSource:0}: Error finding container 04fc84647fea912af06d93c5136be00d6773641532eba501a706bf1ad53f5945: Status 404 returned error can't find the container with id 04fc84647fea912af06d93c5136be00d6773641532eba501a706bf1ad53f5945 Apr 21 07:10:53.121621 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.121598 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:53.121747 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:53.121731 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:53.121807 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:53.121797 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:10:54.121783271 +0000 UTC m=+4.090159485 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:53.222313 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.222282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:53.222410 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:53.222396 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:10:53.222454 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:53.222416 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:10:53.222454 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:53.222426 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:53.222512 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:53.222464 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:10:54.222452014 +0000 UTC m=+4.190828227 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:53.626023 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.625976 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:05:51 +0000 UTC" deadline="2028-01-28 23:36:41.638235588 +0000 UTC" Apr 21 07:10:53.626023 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.626013 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15544h25m48.012225601s" Apr 21 07:10:53.640762 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.640733 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" event={"ID":"b044312b-3805-4344-992b-7e7befb3d7f3","Type":"ContainerStarted","Data":"241d7492e6e3f7787de70ebf8186394a630f5379fb859292270804c2a84668e4"} Apr 21 07:10:53.643291 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.643231 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7vkh9" event={"ID":"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4","Type":"ContainerStarted","Data":"7f0388b0611c2658a8d6f24e9ee6532c77f2f0e354b31dd5fdf73f2bb5d7bc13"} Apr 21 07:10:53.646391 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.646104 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerStarted","Data":"04fc84647fea912af06d93c5136be00d6773641532eba501a706bf1ad53f5945"} Apr 21 07:10:53.649366 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.649339 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"0c45d2622b558f859b09bb89af20e4d0d6cb62c19ecc28220bbe876ebb5169fd"} Apr 21 07:10:53.654561 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.654243 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v524f" event={"ID":"476caf85-7f49-4bd9-944d-7dd2e7975a87","Type":"ContainerStarted","Data":"d86fddb49558985cf01dfd59f1bbe13ccc09c1124c3402140915b6c5b643a48c"} Apr 21 07:10:53.657026 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.656055 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzpkj" event={"ID":"64fd83cc-7ef6-4cb9-892b-0111cac9771d","Type":"ContainerStarted","Data":"a0bd90ad7d635ecd841097647c3402de2865e01c8459c776b82cc4e44e497cbf"} Apr 21 07:10:53.658343 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.658263 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" event={"ID":"7c0141af-1317-4665-bd56-7841a1731312","Type":"ContainerStarted","Data":"ede82e10401e14768b2f4027033e6323380c89f3ab079bba10757290a72fd762"} Apr 21 07:10:53.661376 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.660958 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" 
event={"ID":"3b2b2976bec5eff564002b454ef52b93","Type":"ContainerStarted","Data":"6a810956f07797ad1edbea8eb1b8f4d6e4760942d43c37477c61be8629973787"} Apr 21 07:10:53.670160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.670079 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5hf89" event={"ID":"8352266b-7f87-4b49-9222-1a7518a8bda8","Type":"ContainerStarted","Data":"35e2e1f71c9aef28ae14d86279470311e6c7675f60c6e5ee751f0294ddc6930f"} Apr 21 07:10:53.673289 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.673237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bs4gw" event={"ID":"918e28f2-6377-405c-885f-92621fe803a0","Type":"ContainerStarted","Data":"70ba956f9141fbd8c8fdc588fbbec3cafafd5bf1d1a95f6831de94e68f9c3208"} Apr 21 07:10:53.683477 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:53.683431 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-184.ec2.internal" podStartSLOduration=1.683416475 podStartE2EDuration="1.683416475s" podCreationTimestamp="2026-04-21 07:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:10:53.682282879 +0000 UTC m=+3.650659112" watchObservedRunningTime="2026-04-21 07:10:53.683416475 +0000 UTC m=+3.651792711" Apr 21 07:10:54.131128 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:54.131079 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:54.131311 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.131242 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:54.131376 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.131320 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:10:56.131287854 +0000 UTC m=+6.099664072 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:54.232193 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:54.232117 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:54.232350 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.232281 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:10:54.232350 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.232301 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:10:54.232350 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.232314 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:54.232573 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.232372 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:10:56.232352989 +0000 UTC m=+6.200729206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:54.630945 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:54.630855 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:54.631438 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.630988 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:10:54.633883 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:54.633667 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:54.633883 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:54.633777 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:10:54.684131 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:54.684090 2567 generic.go:358] "Generic (PLEG): container finished" podID="bcfcf54b7f1b4b651ff33830189a3fdf" containerID="5a3e8bd501cf799425a53c8eda893af85a480a2fa1cb4b5b8fb2ec0452eb7a6d" exitCode=0 Apr 21 07:10:54.685186 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:54.685155 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" event={"ID":"bcfcf54b7f1b4b651ff33830189a3fdf","Type":"ContainerDied","Data":"5a3e8bd501cf799425a53c8eda893af85a480a2fa1cb4b5b8fb2ec0452eb7a6d"} Apr 21 07:10:55.705166 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:55.705130 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" event={"ID":"bcfcf54b7f1b4b651ff33830189a3fdf","Type":"ContainerStarted","Data":"35a9807075bde51eee52dfd18c9d8dffc3cecb22557011fe534167898a83a8a5"} Apr 21 07:10:56.149565 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:56.149518 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:56.149747 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.149668 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:56.149747 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.149725 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:00.149707553 +0000 UTC m=+10.118083784 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:10:56.250749 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:56.250716 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:56.250954 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.250934 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:10:56.251038 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.250961 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:10:56.251038 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.251000 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:56.251150 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.251085 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:00.251066318 +0000 UTC m=+10.219442533 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:10:56.634414 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:56.634383 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:56.634620 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:56.634383 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:56.634620 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.634579 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:10:56.634745 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:56.634676 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:10:58.631065 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:58.631021 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:10:58.631540 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:10:58.631102 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:10:58.631540 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:58.631190 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:10:58.631540 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:10:58.631305 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:00.186015 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:00.185407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:00.186015 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.185601 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:00.186015 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.185668 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:08.185649777 +0000 UTC m=+18.154025993 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:00.286317 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:00.286281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:00.286491 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.286443 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:00.286491 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.286462 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:00.286491 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.286474 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:00.286677 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.286546 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:08.286513894 +0000 UTC m=+18.254890109 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:00.632183 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:00.631690 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:00.632183 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.631799 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:00.632183 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:00.631881 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:00.632183 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:00.632006 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:02.631436 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:02.631400 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:02.631924 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:02.631543 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:02.631924 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:02.631592 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:02.631924 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:02.631689 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:04.630759 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:04.630723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:04.631215 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:04.630723 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:04.631215 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:04.630869 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:04.631215 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:04.630907 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:05.216886 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.216838 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-184.ec2.internal" podStartSLOduration=13.216822761 podStartE2EDuration="13.216822761s" podCreationTimestamp="2026-04-21 07:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:10:55.726674861 +0000 UTC m=+5.695051097" watchObservedRunningTime="2026-04-21 07:11:05.216822761 +0000 UTC m=+15.185198996" Apr 21 07:11:05.217297 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.217278 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6h4bv"] Apr 21 07:11:05.247204 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.247173 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.247364 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:05.247259 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:05.321044 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.321010 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.321201 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.321110 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a531b156-35af-430e-b636-9146320cb9f5-dbus\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.321201 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.321176 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a531b156-35af-430e-b636-9146320cb9f5-kubelet-config\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.421471 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.421439 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a531b156-35af-430e-b636-9146320cb9f5-dbus\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.421642 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.421491 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/a531b156-35af-430e-b636-9146320cb9f5-kubelet-config\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.421642 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.421558 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.421727 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.421633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a531b156-35af-430e-b636-9146320cb9f5-kubelet-config\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.421727 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:05.421656 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:05.421727 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.421637 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a531b156-35af-430e-b636-9146320cb9f5-dbus\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.421727 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:05.421705 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret podName:a531b156-35af-430e-b636-9146320cb9f5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:05.921691135 +0000 UTC m=+15.890067349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret") pod "global-pull-secret-syncer-6h4bv" (UID: "a531b156-35af-430e-b636-9146320cb9f5") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:05.924191 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:05.924105 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:05.924629 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:05.924233 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:05.924629 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:05.924296 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret podName:a531b156-35af-430e-b636-9146320cb9f5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:06.924280565 +0000 UTC m=+16.892656785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret") pod "global-pull-secret-syncer-6h4bv" (UID: "a531b156-35af-430e-b636-9146320cb9f5") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:06.634207 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:06.634179 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:06.634371 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:06.634225 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:06.634371 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:06.634178 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:06.634371 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:06.634297 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:06.634371 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:06.634362 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:06.634618 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:06.634454 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:06.932824 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:06.932726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:06.933295 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:06.932838 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:06.933295 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:06.932909 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret podName:a531b156-35af-430e-b636-9146320cb9f5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:08.932888389 +0000 UTC m=+18.901264608 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret") pod "global-pull-secret-syncer-6h4bv" (UID: "a531b156-35af-430e-b636-9146320cb9f5") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:08.241269 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:08.241222 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:08.241691 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.241378 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:08.241691 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.241460 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:24.241439548 +0000 UTC m=+34.209815788 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:08.342353 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:08.342313 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:08.342508 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.342494 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:08.342585 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.342514 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:08.342585 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.342537 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:08.342670 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.342594 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:24.342576454 +0000 UTC m=+34.310952682 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:08.631652 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:08.631565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:08.631652 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:08.631590 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:08.631868 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:08.631565 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:08.631868 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.631683 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:08.631868 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.631772 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:08.631868 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.631861 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:08.946921 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:08.946832 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:08.947080 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.947009 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:08.947143 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:08.947094 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret podName:a531b156-35af-430e-b636-9146320cb9f5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:12.947077695 +0000 UTC m=+22.915453909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret") pod "global-pull-secret-syncer-6h4bv" (UID: "a531b156-35af-430e-b636-9146320cb9f5") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:10.631810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.631584 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:10.632401 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.631657 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:10.632401 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:10.631856 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:10.632401 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.631684 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:10.632401 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:10.631900 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:10.632401 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:10.632004 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:10.732680 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.732644 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bs4gw" event={"ID":"918e28f2-6377-405c-885f-92621fe803a0","Type":"ContainerStarted","Data":"deb7aa04d5468b149b1f8dcf9e0350603e3fbf45e65c2fae56cbf01663768509"} Apr 21 07:11:10.733855 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.733829 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" event={"ID":"b044312b-3805-4344-992b-7e7befb3d7f3","Type":"ContainerStarted","Data":"1300d76322b8c6a7720a0605e2254137bec091df5703c23e76dc652026ac0777"} Apr 21 07:11:10.735102 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.735083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7vkh9" event={"ID":"8105a8f5-e174-49e3-ba2e-c9e8b7d649a4","Type":"ContainerStarted","Data":"87ac345a3c3092f76e1863d0821ecaa656821a3fc4d7f114dd331253f46a4a73"} Apr 21 07:11:10.736566 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.736544 2567 generic.go:358] "Generic (PLEG): container finished" podID="ca549d86-e91c-4488-bfac-cf936e205050" containerID="b138a8beae99705b55ecb7d48b38bb5c13326b7d5031912045eed30b9335a319" exitCode=0 Apr 21 07:11:10.736649 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.736611 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerDied","Data":"b138a8beae99705b55ecb7d48b38bb5c13326b7d5031912045eed30b9335a319"} Apr 21 07:11:10.739103 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.739086 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"4210aceb16dbdb5f7d940173f022b0cae886a3f31ff9eb440470681c6230553a"} Apr 21 07:11:10.739181 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.739109 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"8c2c69d2efdba626304dc38e3df9795c4806d8f5e70a0e85403507533443e581"} Apr 21 07:11:10.739181 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.739121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"508051659bcceae4ce8a737628a74cacec7731edc8cd896921e87b886b4ff2d7"} Apr 21 07:11:10.739181 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.739134 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"443dd6920af5c2fb29e41aebb90aaf1964f378596eda3d1944733548afac3405"} Apr 21 07:11:10.739181 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.739148 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"1d70e2d5f4217478a092f9b1d6186371f304200083340b63f69e34dd3c5c17dd"} Apr 21 07:11:10.739181 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.739157 2567 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"e20ec9d3f3eb26d8942d4112f21477462d6d8c73667ce45ad2c58f41ee3c4786"} Apr 21 07:11:10.740298 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.740282 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v524f" event={"ID":"476caf85-7f49-4bd9-944d-7dd2e7975a87","Type":"ContainerStarted","Data":"8639bef8d7c22c39281dcc477a9d9a8c1dfaabe987f3dc4a22073ca0225cd4a2"} Apr 21 07:11:10.741401 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.741383 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzpkj" event={"ID":"64fd83cc-7ef6-4cb9-892b-0111cac9771d","Type":"ContainerStarted","Data":"6ef2c097ca53170b74668c9d1be90f8d32fb726427a0d612f4a20b87281f3184"} Apr 21 07:11:10.742569 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.742553 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" event={"ID":"7c0141af-1317-4665-bd56-7841a1731312","Type":"ContainerStarted","Data":"29ce743f5b9d8a887591a8a545c35410ca87a4fce26ca19756e9fc8edab742cf"} Apr 21 07:11:10.753372 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.753337 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bs4gw" podStartSLOduration=2.809365051 podStartE2EDuration="19.753327465s" podCreationTimestamp="2026-04-21 07:10:51 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.11381245 +0000 UTC m=+3.082188664" lastFinishedPulling="2026-04-21 07:11:10.057774848 +0000 UTC m=+20.026151078" observedRunningTime="2026-04-21 07:11:10.753123589 +0000 UTC m=+20.721499825" watchObservedRunningTime="2026-04-21 07:11:10.753327465 +0000 UTC m=+20.721703697" Apr 21 07:11:10.806244 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.806192 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vxq2w" podStartSLOduration=3.905284302 podStartE2EDuration="20.806174611s" podCreationTimestamp="2026-04-21 07:10:50 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.109303517 +0000 UTC m=+3.077679730" lastFinishedPulling="2026-04-21 07:11:10.010193819 +0000 UTC m=+19.978570039" observedRunningTime="2026-04-21 07:11:10.777641474 +0000 UTC m=+20.746017713" watchObservedRunningTime="2026-04-21 07:11:10.806174611 +0000 UTC m=+20.774550848" Apr 21 07:11:10.901830 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.901736 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v524f" podStartSLOduration=4.008227869 podStartE2EDuration="20.901721404s" podCreationTimestamp="2026-04-21 07:10:50 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.116479238 +0000 UTC m=+3.084855465" lastFinishedPulling="2026-04-21 07:11:10.009972783 +0000 UTC m=+19.978349000" observedRunningTime="2026-04-21 07:11:10.863710456 +0000 UTC m=+20.832086692" watchObservedRunningTime="2026-04-21 07:11:10.901721404 +0000 UTC m=+20.870097639" Apr 21 07:11:10.989455 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:10.989400 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qzpkj" podStartSLOduration=3.067195787 podStartE2EDuration="19.989385534s" podCreationTimestamp="2026-04-21 07:10:51 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.112416233 +0000 UTC m=+3.080792446" lastFinishedPulling="2026-04-21 
07:11:10.034605963 +0000 UTC m=+20.002982193" observedRunningTime="2026-04-21 07:11:10.901686226 +0000 UTC m=+20.870062462" watchObservedRunningTime="2026-04-21 07:11:10.989385534 +0000 UTC m=+20.957761770" Apr 21 07:11:11.272392 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.272367 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 07:11:11.572109 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.571949 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:11:11.272389333Z","UUID":"0002253f-1618-4701-aa28-d5a848aeee8c","Handler":null,"Name":"","Endpoint":""} Apr 21 07:11:11.575232 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.575208 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 07:11:11.575360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.575239 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 07:11:11.634710 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.634673 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v524f" Apr 21 07:11:11.635368 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.635351 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v524f" Apr 21 07:11:11.661868 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.661821 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7vkh9" podStartSLOduration=4.793824957 podStartE2EDuration="21.661808102s" podCreationTimestamp="2026-04-21 07:10:50 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.142144277 +0000 UTC m=+3.110520490" lastFinishedPulling="2026-04-21 07:11:10.010127412 +0000 UTC m=+19.978503635" observedRunningTime="2026-04-21 07:11:10.990819352 +0000 UTC m=+20.959195599" watchObservedRunningTime="2026-04-21 07:11:11.661808102 +0000 UTC m=+21.630184350" Apr 21 07:11:11.746345 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.746308 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" event={"ID":"b044312b-3805-4344-992b-7e7befb3d7f3","Type":"ContainerStarted","Data":"e579695df226e1934c9b566370dd3ad5decc5f7b222a43cd0817f0695d90bd8b"} Apr 21 07:11:11.747871 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.747837 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5hf89" event={"ID":"8352266b-7f87-4b49-9222-1a7518a8bda8","Type":"ContainerStarted","Data":"e9757d2cf708eed43f4c4134b4e05046ae26060e5546f3860c40eddd369aaaf9"} Apr 21 07:11:11.766826 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:11.766776 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5hf89" podStartSLOduration=4.873432198 podStartE2EDuration="21.766761886s" podCreationTimestamp="2026-04-21 07:10:50 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.13889741 +0000 UTC m=+3.107273624" lastFinishedPulling="2026-04-21 07:11:10.032227096 +0000 UTC m=+20.000603312" 
observedRunningTime="2026-04-21 07:11:11.766701056 +0000 UTC m=+21.735077296" watchObservedRunningTime="2026-04-21 07:11:11.766761886 +0000 UTC m=+21.735138122" Apr 21 07:11:12.631054 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.630967 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:12.631247 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.630967 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:12.631247 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:12.631107 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:12.631247 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:12.631191 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:12.631247 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.630968 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:12.631467 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:12.631283 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:12.754872 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.754636 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" event={"ID":"b044312b-3805-4344-992b-7e7befb3d7f3","Type":"ContainerStarted","Data":"469afed264a75c68bc845c7126d5db060ee1ff387e7b02945a325ec4a73556c7"} Apr 21 07:11:12.758679 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.758649 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"f6945aeb22eba2075f033c943d95c3c69f1a260f9d52da9e178f4b50bf9ccd5e"} Apr 21 07:11:12.759042 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.759015 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:11:12.781577 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.781500 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gnx48" podStartSLOduration=2.700713699 podStartE2EDuration="21.781478931s" podCreationTimestamp="2026-04-21 07:10:51 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.109359594 +0000 UTC m=+3.077735819" lastFinishedPulling="2026-04-21 07:11:12.190124837 +0000 UTC m=+22.158501051" observedRunningTime="2026-04-21 07:11:12.781436697 +0000 UTC m=+22.749812933" watchObservedRunningTime="2026-04-21 07:11:12.781478931 +0000 UTC m=+22.749855168" Apr 21 07:11:12.977511 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:12.977426 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:12.977695 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:12.977575 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:12.977695 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:12.977645 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret podName:a531b156-35af-430e-b636-9146320cb9f5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:20.977626659 +0000 UTC m=+30.946002873 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret") pod "global-pull-secret-syncer-6h4bv" (UID: "a531b156-35af-430e-b636-9146320cb9f5") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:14.452372 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:14.452322 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v524f" Apr 21 07:11:14.452978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:14.452461 2567 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:11:14.453223 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:14.453199 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v524f" Apr 21 07:11:14.633510 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:14.633470 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:14.633510 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:14.633511 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:14.633780 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:14.633619 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:14.633780 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:14.633709 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:14.633780 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:14.633747 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:14.633957 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:14.633815 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:15.765112 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.764924 2567 generic.go:358] "Generic (PLEG): container finished" podID="ca549d86-e91c-4488-bfac-cf936e205050" containerID="34cf9a2fb243dc3c29a628fc1d08736354d842f3a46c1b2fa99e2b03cba47daf" exitCode=0 Apr 21 07:11:15.766051 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.765001 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerDied","Data":"34cf9a2fb243dc3c29a628fc1d08736354d842f3a46c1b2fa99e2b03cba47daf"} Apr 21 07:11:15.768386 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.768364 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" event={"ID":"c395523b-6f94-447f-a14f-b3e86618c396","Type":"ContainerStarted","Data":"43d9a34e4ce2c5426e375d4073e696e51aee76652b0fa3440cabc0bd4d187772"} Apr 21 07:11:15.768715 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.768697 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:11:15.768788 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.768721 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:11:15.768788 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.768732 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:11:15.783110 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.783080 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:11:15.783228 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.783145 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:11:15.825578 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:15.825470 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" podStartSLOduration=7.764055482 podStartE2EDuration="24.825448825s" podCreationTimestamp="2026-04-21 07:10:51 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.138733841 +0000 UTC m=+3.107110055" lastFinishedPulling="2026-04-21 07:11:10.200127169 +0000 UTC m=+20.168503398" observedRunningTime="2026-04-21 07:11:15.82409115 +0000 UTC m=+25.792467387" watchObservedRunningTime="2026-04-21 07:11:15.825448825 +0000 UTC m=+25.793825059" Apr 21 07:11:16.631052 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.631014 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:16.631203 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:16.631164 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:16.631278 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.631261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:16.631403 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:16.631374 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:16.631518 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.631407 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:16.631611 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:16.631549 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:16.774396 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.774363 2567 generic.go:358] "Generic (PLEG): container finished" podID="ca549d86-e91c-4488-bfac-cf936e205050" containerID="52c25d3f8d79a13ef2d76575ceeb77a7ba36af39cf5b16a3cdb9c70badad4e51" exitCode=0 Apr 21 07:11:16.774804 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.774446 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerDied","Data":"52c25d3f8d79a13ef2d76575ceeb77a7ba36af39cf5b16a3cdb9c70badad4e51"} Apr 21 07:11:16.877486 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.877408 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6h4bv"] Apr 21 07:11:16.877633 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.877500 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:16.877633 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:16.877596 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:16.880911 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.880881 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fsfmp"] Apr 21 07:11:16.881060 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.880975 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:16.881120 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:16.881065 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:16.881368 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.881347 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xxrlv"] Apr 21 07:11:16.881469 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:16.881424 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:16.881544 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:16.881510 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:17.778513 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:17.778421 2567 generic.go:358] "Generic (PLEG): container finished" podID="ca549d86-e91c-4488-bfac-cf936e205050" containerID="b0d4e070730fefd1b17868e06d074220a7d44233b1e675c91e314fb80c393117" exitCode=0 Apr 21 07:11:17.778895 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:17.778509 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerDied","Data":"b0d4e070730fefd1b17868e06d074220a7d44233b1e675c91e314fb80c393117"} Apr 21 07:11:18.630761 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:18.630721 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:18.630761 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:18.630743 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:18.631015 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:18.630727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:18.631015 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:18.630859 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:18.631015 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:18.630944 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:18.631189 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:18.631046 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:20.632996 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:20.632792 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:20.633658 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:20.632861 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:20.633658 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:20.633085 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:20.633658 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:20.632890 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:20.633658 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:20.633181 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:20.633658 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:20.633329 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:21.036349 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:21.036268 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:21.036498 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:21.036417 2567 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:21.036554 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:21.036505 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret podName:a531b156-35af-430e-b636-9146320cb9f5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:37.03648897 +0000 UTC m=+47.004865184 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret") pod "global-pull-secret-syncer-6h4bv" (UID: "a531b156-35af-430e-b636-9146320cb9f5") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:11:22.630905 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.630864 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:22.631479 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.630864 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:22.631479 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:22.631001 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fsfmp" podUID="39152450-b5d7-466f-b0a7-58dad042db38" Apr 21 07:11:22.631479 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:22.631099 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6h4bv" podUID="a531b156-35af-430e-b636-9146320cb9f5" Apr 21 07:11:22.631479 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.631155 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:22.631479 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:22.631234 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xxrlv" podUID="6d1ac31b-8866-4817-8119-87e810a0da44" Apr 21 07:11:22.887724 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.887638 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-184.ec2.internal" event="NodeReady" Apr 21 07:11:22.887878 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.887791 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 07:11:22.946088 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.946048 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk"] Apr 21 07:11:22.972502 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.972398 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c96f69849-stwdh"] Apr 21 07:11:22.973431 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.972610 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:22.975901 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.975875 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 07:11:22.976032 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.975910 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 07:11:22.976032 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.976025 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 07:11:22.976294 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.976273 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 07:11:22.987515 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.987492 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f"] Apr 21 07:11:22.987675 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.987651 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:22.995276 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.995250 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 07:11:22.995276 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.995266 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 07:11:22.995581 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.995559 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sgb5p\"" Apr 21 07:11:22.995906 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:22.995888 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 07:11:23.005432 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.005406 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd"] Apr 21 07:11:23.005576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.005560 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.005849 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.005822 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 07:11:23.014620 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.011810 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 07:11:23.017123 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017100 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c96f69849-stwdh"] Apr 21 07:11:23.017233 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017129 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd"] Apr 21 07:11:23.017233 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017142 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f"] Apr 21 07:11:23.017233 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017153 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk"] Apr 21 07:11:23.017414 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017244 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qn9pw"] Apr 21 07:11:23.017414 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017247 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.017544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017412 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 07:11:23.017700 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017653 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 07:11:23.017989 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.017736 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 07:11:23.019624 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.019599 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-qbcdw\"" Apr 21 07:11:23.020672 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.020655 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 07:11:23.032331 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.032303 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qb2h5"] Apr 21 07:11:23.032462 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.032442 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.036208 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.036027 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 07:11:23.036208 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.036091 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 07:11:23.036208 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.036112 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 07:11:23.036208 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.036182 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vr4v6\"" Apr 21 07:11:23.051393 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.051351 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8ac04be3-b26f-43a1-9f42-3f65f3b37503-klusterlet-config\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.051393 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.051386 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qn9pw"] Apr 21 07:11:23.051654 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.051413 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qb2h5"] Apr 21 07:11:23.051654 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.051460 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc42r\" (UniqueName: \"kubernetes.io/projected/8ac04be3-b26f-43a1-9f42-3f65f3b37503-kube-api-access-lc42r\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.051654 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.051589 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.051794 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.051777 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ac04be3-b26f-43a1-9f42-3f65f3b37503-tmp\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.057579 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.057557 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 07:11:23.057579 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.057575 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b24jv\"" Apr 21 07:11:23.057764 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.057587 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 07:11:23.152326 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.152326 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152308 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc42r\" (UniqueName: \"kubernetes.io/projected/8ac04be3-b26f-43a1-9f42-3f65f3b37503-kube-api-access-lc42r\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152342 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152363 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12940af0-6363-4ae3-bd15-0431283aae9a-config-volume\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152387 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgmqg\" (UniqueName: \"kubernetes.io/projected/12940af0-6363-4ae3-bd15-0431283aae9a-kube-api-access-pgmqg\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152423 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-certificates\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152462 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8lzj\" (UniqueName: \"kubernetes.io/projected/212502a2-9d42-4548-be4e-1a54064ecdf5-kube-api-access-d8lzj\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152503 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-ca\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.152576 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-trusted-ca\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zsrz\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-kube-api-access-7zsrz\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ac04be3-b26f-43a1-9f42-3f65f3b37503-tmp\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152693 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-hub\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d605a4c8-bdf6-482a-9491-bc1262224419-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152770 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/12940af0-6363-4ae3-bd15-0431283aae9a-tmp-dir\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-ca-trust-extracted\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8ac04be3-b26f-43a1-9f42-3f65f3b37503-klusterlet-config\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152892 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c4dd\" (UniqueName: \"kubernetes.io/projected/0124a042-96fc-4c9a-92ea-2fb1e7d195fe-kube-api-access-4c4dd\") pod \"managed-serviceaccount-addon-agent-f7c86dfc-zvgmd\" (UID: \"0124a042-96fc-4c9a-92ea-2fb1e7d195fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.152942 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c698\" (UniqueName: \"kubernetes.io/projected/d605a4c8-bdf6-482a-9491-bc1262224419-kube-api-access-6c698\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: 
\"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.153509 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152951 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.153509 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.152973 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-image-registry-private-configuration\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.153509 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.153022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-bound-sa-token\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.153509 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.153061 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-installation-pull-secrets\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.153509 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.153087 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0124a042-96fc-4c9a-92ea-2fb1e7d195fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f7c86dfc-zvgmd\" (UID: \"0124a042-96fc-4c9a-92ea-2fb1e7d195fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.153509 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.153355 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8ac04be3-b26f-43a1-9f42-3f65f3b37503-tmp\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.157899 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.157876 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8ac04be3-b26f-43a1-9f42-3f65f3b37503-klusterlet-config\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.171943 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.171920 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc42r\" (UniqueName: 
\"kubernetes.io/projected/8ac04be3-b26f-43a1-9f42-3f65f3b37503-kube-api-access-lc42r\") pod \"klusterlet-addon-workmgr-994b75948-dv2sk\" (UID: \"8ac04be3-b26f-43a1-9f42-3f65f3b37503\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.254096 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254057 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c4dd\" (UniqueName: \"kubernetes.io/projected/0124a042-96fc-4c9a-92ea-2fb1e7d195fe-kube-api-access-4c4dd\") pod \"managed-serviceaccount-addon-agent-f7c86dfc-zvgmd\" (UID: \"0124a042-96fc-4c9a-92ea-2fb1e7d195fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c698\" (UniqueName: \"kubernetes.io/projected/d605a4c8-bdf6-482a-9491-bc1262224419-kube-api-access-6c698\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254141 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-image-registry-private-configuration\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254204 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-bound-sa-token\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-installation-pull-secrets\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254263 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0124a042-96fc-4c9a-92ea-2fb1e7d195fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f7c86dfc-zvgmd\" (UID: \"0124a042-96fc-4c9a-92ea-2fb1e7d195fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254283 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.254307 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.254300 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254321 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12940af0-6363-4ae3-bd15-0431283aae9a-config-volume\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.254374 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:11:23.754355169 +0000 UTC m=+33.722731388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254399 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgmqg\" (UniqueName: \"kubernetes.io/projected/12940af0-6363-4ae3-bd15-0431283aae9a-kube-api-access-pgmqg\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.254434 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.254480 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:23.754468858 +0000 UTC m=+33.722845074 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254432 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-certificates\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254511 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8lzj\" (UniqueName: \"kubernetes.io/projected/212502a2-9d42-4548-be4e-1a54064ecdf5-kube-api-access-d8lzj\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-ca\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254608 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-trusted-ca\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254659 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zsrz\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-kube-api-access-7zsrz\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254691 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-hub\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254729 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d605a4c8-bdf6-482a-9491-bc1262224419-ocpservice-ca\") pod 
\"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.255089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254937 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12940af0-6363-4ae3-bd15-0431283aae9a-config-volume\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254959 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/12940af0-6363-4ae3-bd15-0431283aae9a-tmp-dir\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.254986 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-ca-trust-extracted\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.255048 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-certificates\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.255354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-ca-trust-extracted\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.255356 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d605a4c8-bdf6-482a-9491-bc1262224419-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.255448 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.255460 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.255502 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:11:23.755486706 +0000 UTC m=+33.723862938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:23.256046 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.255605 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/12940af0-6363-4ae3-bd15-0431283aae9a-tmp-dir\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.256550 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.256405 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-trusted-ca\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.257979 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.257567 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0124a042-96fc-4c9a-92ea-2fb1e7d195fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f7c86dfc-zvgmd\" (UID: \"0124a042-96fc-4c9a-92ea-2fb1e7d195fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.257979 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.257846 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-image-registry-private-configuration\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.257979 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.257929 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.257979 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.257942 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-ca\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.258225 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.258081 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" 
(UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.258711 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.258687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d605a4c8-bdf6-482a-9491-bc1262224419-hub\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.258795 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.258731 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-installation-pull-secrets\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.273690 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.273644 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgmqg\" (UniqueName: \"kubernetes.io/projected/12940af0-6363-4ae3-bd15-0431283aae9a-kube-api-access-pgmqg\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.273829 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.273776 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zsrz\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-kube-api-access-7zsrz\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.273829 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.273765 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-bound-sa-token\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.274227 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.274185 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c4dd\" (UniqueName: \"kubernetes.io/projected/0124a042-96fc-4c9a-92ea-2fb1e7d195fe-kube-api-access-4c4dd\") pod \"managed-serviceaccount-addon-agent-f7c86dfc-zvgmd\" (UID: \"0124a042-96fc-4c9a-92ea-2fb1e7d195fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.276205 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.276163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8lzj\" (UniqueName: \"kubernetes.io/projected/212502a2-9d42-4548-be4e-1a54064ecdf5-kube-api-access-d8lzj\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.282053 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.282026 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c698\" (UniqueName: 
\"kubernetes.io/projected/d605a4c8-bdf6-482a-9491-bc1262224419-kube-api-access-6c698\") pod \"cluster-proxy-proxy-agent-5ddb8d5989-sz45f\" (UID: \"d605a4c8-bdf6-482a-9491-bc1262224419\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.286653 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.286617 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:23.332995 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.332962 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:11:23.335701 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.335682 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" Apr 21 07:11:23.742351 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.741786 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd"] Apr 21 07:11:23.759118 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.759093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:23.759229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.759149 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:23.759229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.759200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:23.759295 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759243 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:23.759328 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759299 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:23.759328 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759308 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:24.75928692 +0000 UTC m=+34.727663151 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:23.759328 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759315 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:23.759449 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759349 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:23.759449 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759366 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:11:24.75934973 +0000 UTC m=+34.727725958 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:23.759449 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:23.759402 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:11:24.75938837 +0000 UTC m=+34.727764598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:23.763544 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.763509 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f"] Apr 21 07:11:23.764232 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:23.764210 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk"] Apr 21 07:11:23.879059 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:23.878967 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0124a042_96fc_4c9a_92ea_2fb1e7d195fe.slice/crio-047dc33d5905446b4e2c08d68995367cc46a8802ae154a79aebde25cf1294c19 WatchSource:0}: Error finding container 047dc33d5905446b4e2c08d68995367cc46a8802ae154a79aebde25cf1294c19: Status 404 returned error can't find the container with id 047dc33d5905446b4e2c08d68995367cc46a8802ae154a79aebde25cf1294c19 Apr 21 07:11:23.879366 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:23.879345 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac04be3_b26f_43a1_9f42_3f65f3b37503.slice/crio-afafa1a5ee84ea8d5db08f538b46d9a6c7881a01f7a017ad03ecea7d52e4f456 WatchSource:0}: Error finding container afafa1a5ee84ea8d5db08f538b46d9a6c7881a01f7a017ad03ecea7d52e4f456: Status 404 returned error can't find the container with id 
afafa1a5ee84ea8d5db08f538b46d9a6c7881a01f7a017ad03ecea7d52e4f456 Apr 21 07:11:23.880016 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:23.879990 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd605a4c8_bdf6_482a_9491_bc1262224419.slice/crio-52452db26335315dfdab0348eb441cdd8510245fe16fec8b01f0a2d6dc488d4e WatchSource:0}: Error finding container 52452db26335315dfdab0348eb441cdd8510245fe16fec8b01f0a2d6dc488d4e: Status 404 returned error can't find the container with id 52452db26335315dfdab0348eb441cdd8510245fe16fec8b01f0a2d6dc488d4e Apr 21 07:11:24.263758 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.263722 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:24.263908 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.263882 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:24.263959 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.263949 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:56.263935587 +0000 UTC m=+66.232311801 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:11:24.364472 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.364435 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:24.364672 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.364642 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:11:24.364672 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.364661 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:11:24.364745 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.364675 2567 projected.go:194] Error preparing data for projected volume kube-api-access-hz4vs for pod openshift-network-diagnostics/network-check-target-fsfmp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:24.364745 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.364733 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs 
podName:39152450-b5d7-466f-b0a7-58dad042db38 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:56.364718987 +0000 UTC m=+66.333095205 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hz4vs" (UniqueName: "kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs") pod "network-check-target-fsfmp" (UID: "39152450-b5d7-466f-b0a7-58dad042db38") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:11:24.635482 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.635283 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:24.635482 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.635284 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:24.635752 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.635282 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:24.638132 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.638104 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:11:24.638964 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.638941 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7djql\"" Apr 21 07:11:24.639080 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.639006 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-92shx\"" Apr 21 07:11:24.639154 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.638943 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:11:24.639228 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.639207 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:11:24.639363 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.639342 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 07:11:24.767235 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.767192 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:24.767738 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.767298 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:24.767738 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.767353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:24.767738 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.767496 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:24.767738 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.767571 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:26.767552423 +0000 UTC m=+36.735928642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:24.768011 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.767991 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:24.768071 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.768015 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:24.768071 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.768059 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:11:26.768044542 +0000 UTC m=+36.736420760 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:24.768176 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.768119 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:24.768176 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:24.768152 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:11:26.768140961 +0000 UTC m=+36.736517177 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:24.795871 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.795834 2567 generic.go:358] "Generic (PLEG): container finished" podID="ca549d86-e91c-4488-bfac-cf936e205050" containerID="5005085745bd07077c666fc9caf1ac8f7ad0a0f833c919f8ac0392f56f90e24c" exitCode=0 Apr 21 07:11:24.796040 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.795928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerDied","Data":"5005085745bd07077c666fc9caf1ac8f7ad0a0f833c919f8ac0392f56f90e24c"} Apr 21 07:11:24.799480 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.799444 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" event={"ID":"8ac04be3-b26f-43a1-9f42-3f65f3b37503","Type":"ContainerStarted","Data":"afafa1a5ee84ea8d5db08f538b46d9a6c7881a01f7a017ad03ecea7d52e4f456"} Apr 21 07:11:24.801650 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.801123 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" event={"ID":"d605a4c8-bdf6-482a-9491-bc1262224419","Type":"ContainerStarted","Data":"52452db26335315dfdab0348eb441cdd8510245fe16fec8b01f0a2d6dc488d4e"} Apr 21 07:11:24.805231 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:24.805158 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" event={"ID":"0124a042-96fc-4c9a-92ea-2fb1e7d195fe","Type":"ContainerStarted","Data":"047dc33d5905446b4e2c08d68995367cc46a8802ae154a79aebde25cf1294c19"} Apr 21 07:11:25.812805 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:25.812557 2567 generic.go:358] "Generic (PLEG): container finished" podID="ca549d86-e91c-4488-bfac-cf936e205050" containerID="9da6bf55da4099dc9527ce4c58c771183c51df1e91d96a4b8df1484043c44271" exitCode=0 Apr 21 07:11:25.812805 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:25.812604 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerDied","Data":"9da6bf55da4099dc9527ce4c58c771183c51df1e91d96a4b8df1484043c44271"} Apr 21 07:11:26.787629 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:26.787590 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:26.787801 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:26.787650 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:26.787801 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:26.787698 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:26.787912 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.787825 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:26.787912 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.787837 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:26.787912 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.787883 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:11:30.787868899 +0000 UTC m=+40.756245126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:26.788057 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.788009 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:26.788057 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.788039 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:26.788164 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.788082 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:11:30.788062349 +0000 UTC m=+40.756438582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:26.788164 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:26.788103 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:30.788092374 +0000 UTC m=+40.756468594 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:30.821575 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.821513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.821595 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.821640 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821685 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821726 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821737 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821738 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821771 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.821749745 +0000 UTC m=+48.790125961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821791 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.821781634 +0000 UTC m=+48.790157848 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:30.822058 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:30.821805 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.821797437 +0000 UTC m=+48.790173651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:30.824079 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.824058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" event={"ID":"8ac04be3-b26f-43a1-9f42-3f65f3b37503","Type":"ContainerStarted","Data":"17962798d86c816dbc24ab95b58701779007c8206695fb1adca3ef2b06d6dd03"} Apr 21 07:11:30.824291 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.824267 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:30.825476 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.825453 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" event={"ID":"d605a4c8-bdf6-482a-9491-bc1262224419","Type":"ContainerStarted","Data":"a87053d63bb6c7ebde0e0674f78212ba1cf3c738eafab3a1f827b794050aec43"} Apr 21 07:11:30.826015 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.825998 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" Apr 21 07:11:30.826800 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.826779 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" event={"ID":"0124a042-96fc-4c9a-92ea-2fb1e7d195fe","Type":"ContainerStarted","Data":"4256fdfed56f7c360fb046801f1f8ce94b81693d64389e8d8e1fde7a5612c033"} Apr 21 07:11:30.829561 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.829543 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" event={"ID":"ca549d86-e91c-4488-bfac-cf936e205050","Type":"ContainerStarted","Data":"7e77f4b7689b7d8aaaa445565e48489fd0e1726829b4fa9dcd877dcda88d87c6"} Apr 21 07:11:30.844493 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.844443 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-994b75948-dv2sk" podStartSLOduration=4.959846383 podStartE2EDuration="10.844430063s" podCreationTimestamp="2026-04-21 07:11:20 +0000 UTC" firstStartedPulling="2026-04-21 07:11:23.895294022 +0000 UTC m=+33.863670249" lastFinishedPulling="2026-04-21 07:11:29.779877715 +0000 UTC m=+39.748253929" observedRunningTime="2026-04-21 07:11:30.844012668 +0000 UTC m=+40.812388898" watchObservedRunningTime="2026-04-21 07:11:30.844430063 +0000 UTC m=+40.812806300" Apr 21 07:11:30.862878 
ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.862834 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f7c86dfc-zvgmd" podStartSLOduration=4.995514642 podStartE2EDuration="10.862823357s" podCreationTimestamp="2026-04-21 07:11:20 +0000 UTC" firstStartedPulling="2026-04-21 07:11:23.895436214 +0000 UTC m=+33.863812443" lastFinishedPulling="2026-04-21 07:11:29.762744931 +0000 UTC m=+39.731121158" observedRunningTime="2026-04-21 07:11:30.862374909 +0000 UTC m=+40.830751145" watchObservedRunningTime="2026-04-21 07:11:30.862823357 +0000 UTC m=+40.831199593" Apr 21 07:11:30.904575 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:30.904503 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9t6vk" podStartSLOduration=10.123728181 podStartE2EDuration="40.904490261s" podCreationTimestamp="2026-04-21 07:10:50 +0000 UTC" firstStartedPulling="2026-04-21 07:10:53.138789735 +0000 UTC m=+3.107165948" lastFinishedPulling="2026-04-21 07:11:23.919551811 +0000 UTC m=+33.887928028" observedRunningTime="2026-04-21 07:11:30.903359977 +0000 UTC m=+40.871736213" watchObservedRunningTime="2026-04-21 07:11:30.904490261 +0000 UTC m=+40.872866497" Apr 21 07:11:32.834479 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:32.834450 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" event={"ID":"d605a4c8-bdf6-482a-9491-bc1262224419","Type":"ContainerStarted","Data":"d4f84ec55d987d41902baf30467e5aef1b25ef7363cb68e27f5aa7e18dc7d948"} Apr 21 07:11:33.838572 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:33.838515 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" event={"ID":"d605a4c8-bdf6-482a-9491-bc1262224419","Type":"ContainerStarted","Data":"5c8ed5348a018dd92808ca0d052ec3a1a5c0bda69cceddb03db74dbe9891d586"} Apr 21 07:11:33.860323 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:33.860276 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" podStartSLOduration=5.103526446 podStartE2EDuration="13.86026139s" podCreationTimestamp="2026-04-21 07:11:20 +0000 UTC" firstStartedPulling="2026-04-21 07:11:23.895292817 +0000 UTC m=+33.863669031" lastFinishedPulling="2026-04-21 07:11:32.652027747 +0000 UTC m=+42.620403975" observedRunningTime="2026-04-21 07:11:33.860009794 +0000 UTC m=+43.828386030" watchObservedRunningTime="2026-04-21 07:11:33.86026139 +0000 UTC m=+43.828637625" Apr 21 07:11:36.747331 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.747288 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4"] Apr 21 07:11:36.781193 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.781163 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4"] Apr 21 07:11:36.781348 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.781282 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" Apr 21 07:11:36.783847 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.783823 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.784676 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.784655 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.784779 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.784655 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bndn8\"" Apr 21 07:11:36.854645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.854610 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2"] Apr 21 07:11:36.878181 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.878147 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p"] Apr 21 07:11:36.878362 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.878292 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:36.880708 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.880683 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 07:11:36.880823 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.880733 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.881045 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.881029 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-srwvp\"" Apr 21 07:11:36.881891 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.881874 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 07:11:36.881983 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.881897 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.893783 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.893754 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ckzx"] Apr 21 07:11:36.893949 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.893826 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:36.897334 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.896497 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.897334 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.896513 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x6bq9\"" Apr 21 07:11:36.897334 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.896497 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.897334 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.896839 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 07:11:36.926135 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.926107 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5nqns"] Apr 21 07:11:36.926303 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.926261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:36.930739 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.930673 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 07:11:36.930739 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.930676 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 07:11:36.930739 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.930706 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jcsx5\"" Apr 21 07:11:36.932925 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.932903 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.933348 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.933331 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.938810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.938789 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 07:11:36.947294 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.947275 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6965f7486b-s6vrq"] Apr 21 07:11:36.947422 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.947409 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:36.950439 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.950410 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 07:11:36.950439 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.950410 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gmc68\"" Apr 21 07:11:36.950640 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.950509 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 07:11:36.950640 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.950611 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.950792 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.950751 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.956690 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.956666 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 07:11:36.962852 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962828 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p"] Apr 21 07:11:36.962962 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962858 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2"] Apr 21 07:11:36.962962 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962866 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ckzx"] Apr 21 07:11:36.962962 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962874 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5nqns"] Apr 21 07:11:36.962962 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962884 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6965f7486b-s6vrq"] Apr 21 07:11:36.962962 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962909 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts"] Apr 21 07:11:36.963238 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.962973 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:36.968070 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968051 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.968180 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968116 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.968180 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968125 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 07:11:36.968180 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968162 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-x4czj\"" Apr 21 07:11:36.968180 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968121 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 07:11:36.968701 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968675 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 07:11:36.968701 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.968678 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 07:11:36.975495 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.975473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4ggt\" (UniqueName: \"kubernetes.io/projected/01991e8c-8e7b-4dbb-97df-6a5c1999f0eb-kube-api-access-v4ggt\") pod \"volume-data-source-validator-7c6cbb6c87-l47f4\" (UID: \"01991e8c-8e7b-4dbb-97df-6a5c1999f0eb\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" Apr 21 07:11:36.985176 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.985151 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ngssz"] Apr 21 07:11:36.985326 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.985309 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:36.987649 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.987627 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 07:11:36.987981 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.987961 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:11:36.988094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.988055 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xp8hb\"" Apr 21 07:11:36.988094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.988060 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 07:11:36.988319 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:36.988288 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 07:11:37.001050 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.000995 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb"] Apr 21 07:11:37.001157 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.001141 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.004827 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.004807 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 07:11:37.005049 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.005035 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 07:11:37.005645 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.005628 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-44s8p\"" Apr 21 07:11:37.021600 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.021579 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff"] Apr 21 07:11:37.021749 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.021733 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.024017 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.023990 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 07:11:37.024112 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.024001 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:11:37.024112 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.024079 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 07:11:37.024438 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.024420 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 07:11:37.024684 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.024659 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-24588\"" Apr 21 07:11:37.036897 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.036875 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts"] Apr 21 07:11:37.036897 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.036898 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ngssz"] Apr 21 07:11:37.037053 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.036909 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff"] Apr 21 07:11:37.037053 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.036916 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb"] Apr 21 07:11:37.037053 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.037025 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" Apr 21 07:11:37.039985 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.039724 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-8l5cb\"" Apr 21 07:11:37.076182 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076150 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.076182 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076181 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdjkr\" (UniqueName: \"kubernetes.io/projected/6711400f-c84e-4ed4-a4b1-515e9d0818a7-kube-api-access-vdjkr\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076209 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4ggt\" (UniqueName: \"kubernetes.io/projected/01991e8c-8e7b-4dbb-97df-6a5c1999f0eb-kube-api-access-v4ggt\") pod \"volume-data-source-validator-7c6cbb6c87-l47f4\" (UID: \"01991e8c-8e7b-4dbb-97df-6a5c1999f0eb\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076229 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076310 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ec0ae8-afa6-40cd-943d-465f66eaed59-config\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076346 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8b96\" (UniqueName: \"kubernetes.io/projected/f2ec0ae8-afa6-40cd-943d-465f66eaed59-kube-api-access-q8b96\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076364 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aef3d006-e32d-47ee-b413-d3d17fa20b47-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " 
pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-default-certificate\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.076406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076402 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076420 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aef3d006-e32d-47ee-b413-d3d17fa20b47-service-ca-bundle\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076460 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef3d006-e32d-47ee-b413-d3d17fa20b47-serving-cert\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076475 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/aef3d006-e32d-47ee-b413-d3d17fa20b47-snapshots\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076491 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6711400f-c84e-4ed4-a4b1-515e9d0818a7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076513 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aef3d006-e32d-47ee-b413-d3d17fa20b47-tmp\") pod 
\"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076584 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf7pb\" (UniqueName: \"kubernetes.io/projected/d3bff263-210d-4a8a-baab-642a7254c4f8-kube-api-access-sf7pb\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076620 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076645 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xwtx\" (UniqueName: \"kubernetes.io/projected/aef3d006-e32d-47ee-b413-d3d17fa20b47-kube-api-access-8xwtx\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076666 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2ec0ae8-afa6-40cd-943d-465f66eaed59-trusted-ca\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.076695 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076687 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ec0ae8-afa6-40cd-943d-465f66eaed59-serving-cert\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.077096 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076709 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbznv\" (UniqueName: \"kubernetes.io/projected/68739b76-259c-44cc-ae50-18c754490061-kube-api-access-zbznv\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:37.077096 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.076785 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-stats-auth\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.080053 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.080028 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/a531b156-35af-430e-b636-9146320cb9f5-original-pull-secret\") pod \"global-pull-secret-syncer-6h4bv\" (UID: \"a531b156-35af-430e-b636-9146320cb9f5\") " pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:37.086689 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.086668 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4ggt\" (UniqueName: \"kubernetes.io/projected/01991e8c-8e7b-4dbb-97df-6a5c1999f0eb-kube-api-access-v4ggt\") pod \"volume-data-source-validator-7c6cbb6c87-l47f4\" (UID: \"01991e8c-8e7b-4dbb-97df-6a5c1999f0eb\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" Apr 21 07:11:37.090422 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.090397 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" Apr 21 07:11:37.177372 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177338 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sf7pb\" (UniqueName: \"kubernetes.io/projected/d3bff263-210d-4a8a-baab-642a7254c4f8-kube-api-access-sf7pb\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.177560 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177394 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jd6x\" (UniqueName: \"kubernetes.io/projected/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-kube-api-access-8jd6x\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.177560 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177431 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xwtx\" (UniqueName: \"kubernetes.io/projected/aef3d006-e32d-47ee-b413-d3d17fa20b47-kube-api-access-8xwtx\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.177560 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177458 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2ec0ae8-afa6-40cd-943d-465f66eaed59-trusted-ca\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.177560 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177485 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqn6x\" (UniqueName: \"kubernetes.io/projected/421849b1-db63-490a-b9a2-ed853fdbfbc8-kube-api-access-pqn6x\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.177560 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177517 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ec0ae8-afa6-40cd-943d-465f66eaed59-serving-cert\") pod 
\"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177569 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgwtc\" (UniqueName: \"kubernetes.io/projected/abcc0250-ed3e-47e6-8e0d-cd093fc5184c-kube-api-access-mgwtc\") pod \"network-check-source-8894fc9bd-wjjff\" (UID: \"abcc0250-ed3e-47e6-8e0d-cd093fc5184c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbznv\" (UniqueName: \"kubernetes.io/projected/68739b76-259c-44cc-ae50-18c754490061-kube-api-access-zbznv\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177650 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177714 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-stats-auth\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177740 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdjkr\" (UniqueName: \"kubernetes.io/projected/6711400f-c84e-4ed4-a4b1-515e9d0818a7-kube-api-access-vdjkr\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.177810 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177789 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177818 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177857 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ec0ae8-afa6-40cd-943d-465f66eaed59-config\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177904 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421849b1-db63-490a-b9a2-ed853fdbfbc8-config\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177931 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.177975 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8b96\" (UniqueName: \"kubernetes.io/projected/f2ec0ae8-afa6-40cd-943d-465f66eaed59-kube-api-access-q8b96\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178033 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aef3d006-e32d-47ee-b413-d3d17fa20b47-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178067 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-default-certificate\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.178149 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:37.178229 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.178206 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls podName:6711400f-c84e-4ed4-a4b1-515e9d0818a7 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:11:37.678187442 +0000 UTC m=+47.646563682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8c2r2" (UID: "6711400f-c84e-4ed4-a4b1-515e9d0818a7") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.178422 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.178467 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:37.678452645 +0000 UTC m=+47.646828866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : secret "router-metrics-certs-default" not found Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178725 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/421849b1-db63-490a-b9a2-ed853fdbfbc8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178799 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178856 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aef3d006-e32d-47ee-b413-d3d17fa20b47-service-ca-bundle\") pod \"insights-operator-585dfdc468-5nqns\" (UID: 
\"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef3d006-e32d-47ee-b413-d3d17fa20b47-serving-cert\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178915 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/aef3d006-e32d-47ee-b413-d3d17fa20b47-snapshots\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178943 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6711400f-c84e-4ed4-a4b1-515e9d0818a7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178969 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aef3d006-e32d-47ee-b413-d3d17fa20b47-tmp\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.178992 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2ec0ae8-afa6-40cd-943d-465f66eaed59-config\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.179127 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.179140 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aef3d006-e32d-47ee-b413-d3d17fa20b47-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.179978 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.179180 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:37.679163037 +0000 UTC m=+47.647539267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : configmap references non-existent config key: service-ca.crt Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.179261 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls podName:68739b76-259c-44cc-ae50-18c754490061 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:37.679243996 +0000 UTC m=+47.647620228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fz65p" (UID: "68739b76-259c-44cc-ae50-18c754490061") : secret "samples-operator-tls" not found Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.179318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aef3d006-e32d-47ee-b413-d3d17fa20b47-tmp\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.180117 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6711400f-c84e-4ed4-a4b1-515e9d0818a7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.180120 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2ec0ae8-afa6-40cd-943d-465f66eaed59-trusted-ca\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.180418 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/aef3d006-e32d-47ee-b413-d3d17fa20b47-snapshots\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.180751 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aef3d006-e32d-47ee-b413-d3d17fa20b47-service-ca-bundle\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.180934 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.180803 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2ec0ae8-afa6-40cd-943d-465f66eaed59-serving-cert\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.181281 
ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.181001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-stats-auth\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.181929 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.181911 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef3d006-e32d-47ee-b413-d3d17fa20b47-serving-cert\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.182242 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.182216 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-default-certificate\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.190619 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.190595 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xwtx\" (UniqueName: \"kubernetes.io/projected/aef3d006-e32d-47ee-b413-d3d17fa20b47-kube-api-access-8xwtx\") pod \"insights-operator-585dfdc468-5nqns\" (UID: \"aef3d006-e32d-47ee-b413-d3d17fa20b47\") " pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.191123 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.191103 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8b96\" (UniqueName: \"kubernetes.io/projected/f2ec0ae8-afa6-40cd-943d-465f66eaed59-kube-api-access-q8b96\") pod \"console-operator-9d4b6777b-8ckzx\" (UID: \"f2ec0ae8-afa6-40cd-943d-465f66eaed59\") " pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.191296 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.191276 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdjkr\" (UniqueName: \"kubernetes.io/projected/6711400f-c84e-4ed4-a4b1-515e9d0818a7-kube-api-access-vdjkr\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.191383 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.191364 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf7pb\" (UniqueName: \"kubernetes.io/projected/d3bff263-210d-4a8a-baab-642a7254c4f8-kube-api-access-sf7pb\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.191459 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.191445 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbznv\" (UniqueName: \"kubernetes.io/projected/68739b76-259c-44cc-ae50-18c754490061-kube-api-access-zbznv\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:37.210845 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.210819 2567 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4"] Apr 21 07:11:37.214486 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:37.214457 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01991e8c_8e7b_4dbb_97df_6a5c1999f0eb.slice/crio-798227581a80f9b1cf0374ac05647dcbfce2a4528acb0b233e5665d8a0da0883 WatchSource:0}: Error finding container 798227581a80f9b1cf0374ac05647dcbfce2a4528acb0b233e5665d8a0da0883: Status 404 returned error can't find the container with id 798227581a80f9b1cf0374ac05647dcbfce2a4528acb0b233e5665d8a0da0883 Apr 21 07:11:37.237120 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.237094 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:37.249710 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.249685 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6h4bv" Apr 21 07:11:37.257323 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.257299 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5nqns" Apr 21 07:11:37.279689 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.279656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.279862 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.279721 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421849b1-db63-490a-b9a2-ed853fdbfbc8-config\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.279862 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.279740 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.279862 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.279827 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:11:37.280014 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.279875 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert podName:b70e7d80-e8c8-44d5-8f22-7d192e037f9f nodeName:}" failed. No retries permitted until 2026-04-21 07:11:37.779861774 +0000 UTC m=+47.748237989 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngssz" (UID: "b70e7d80-e8c8-44d5-8f22-7d192e037f9f") : secret "networking-console-plugin-cert" not found Apr 21 07:11:37.280271 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280232 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/421849b1-db63-490a-b9a2-ed853fdbfbc8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.280399 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280291 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.280399 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280367 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jd6x\" (UniqueName: \"kubernetes.io/projected/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-kube-api-access-8jd6x\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.280505 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280405 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqn6x\" (UniqueName: \"kubernetes.io/projected/421849b1-db63-490a-b9a2-ed853fdbfbc8-kube-api-access-pqn6x\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.280505 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgwtc\" (UniqueName: \"kubernetes.io/projected/abcc0250-ed3e-47e6-8e0d-cd093fc5184c-kube-api-access-mgwtc\") pod \"network-check-source-8894fc9bd-wjjff\" (UID: \"abcc0250-ed3e-47e6-8e0d-cd093fc5184c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" Apr 21 07:11:37.280505 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280480 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.280985 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.280946 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.283094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.283072 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.291308 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.291257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgwtc\" (UniqueName: \"kubernetes.io/projected/abcc0250-ed3e-47e6-8e0d-cd093fc5184c-kube-api-access-mgwtc\") pod \"network-check-source-8894fc9bd-wjjff\" (UID: \"abcc0250-ed3e-47e6-8e0d-cd093fc5184c\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" Apr 21 07:11:37.291308 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.291280 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/421849b1-db63-490a-b9a2-ed853fdbfbc8-config\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.291664 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.291636 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.305321 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.305211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/421849b1-db63-490a-b9a2-ed853fdbfbc8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.305718 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.305663 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqn6x\" (UniqueName: \"kubernetes.io/projected/421849b1-db63-490a-b9a2-ed853fdbfbc8-kube-api-access-pqn6x\") pod \"service-ca-operator-d6fc45fc5-clcxb\" (UID: \"421849b1-db63-490a-b9a2-ed853fdbfbc8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.309104 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.309069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jd6x\" (UniqueName: \"kubernetes.io/projected/fe96b4e1-8214-4d66-86fa-ea8e6ddd030c-kube-api-access-8jd6x\") pod \"kube-storage-version-migrator-operator-6769c5d45-lv6ts\" (UID: \"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.332314 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.331859 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" Apr 21 07:11:37.345927 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.345479 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" Apr 21 07:11:37.388429 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.388381 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8ckzx"] Apr 21 07:11:37.392651 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:37.392611 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ec0ae8_afa6_40cd_943d_465f66eaed59.slice/crio-8c994c68172ac2a61bde521af4597eb2d3c056a45077eb9492a716b4ffc83c80 WatchSource:0}: Error finding container 8c994c68172ac2a61bde521af4597eb2d3c056a45077eb9492a716b4ffc83c80: Status 404 returned error can't find the container with id 8c994c68172ac2a61bde521af4597eb2d3c056a45077eb9492a716b4ffc83c80 Apr 21 07:11:37.409423 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.409370 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6h4bv"] Apr 21 07:11:37.423977 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.423929 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5nqns"] Apr 21 07:11:37.435112 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:37.435081 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaef3d006_e32d_47ee_b413_d3d17fa20b47.slice/crio-f6827f8cb6d79b9186be484aadb61df16f1958815310ace52929011688d65062 WatchSource:0}: Error finding container f6827f8cb6d79b9186be484aadb61df16f1958815310ace52929011688d65062: Status 404 returned error can't find the container with id f6827f8cb6d79b9186be484aadb61df16f1958815310ace52929011688d65062 Apr 21 07:11:37.489224 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.489125 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb"] Apr 21 07:11:37.492031 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:37.492004 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421849b1_db63_490a_b9a2_ed853fdbfbc8.slice/crio-5e6facaf7fa3365a4f5b21c4f1b4d4de7291dff9a1d0ef8464e9f883d8c038eb WatchSource:0}: Error finding container 5e6facaf7fa3365a4f5b21c4f1b4d4de7291dff9a1d0ef8464e9f883d8c038eb: Status 404 returned error can't find the container with id 5e6facaf7fa3365a4f5b21c4f1b4d4de7291dff9a1d0ef8464e9f883d8c038eb Apr 21 07:11:37.509251 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.509172 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff"] Apr 21 07:11:37.511942 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:37.511903 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabcc0250_ed3e_47e6_8e0d_cd093fc5184c.slice/crio-b05478534cc8178857e877b5d54b89f7b72e49e839a5766899777f76ae767106 WatchSource:0}: Error finding container b05478534cc8178857e877b5d54b89f7b72e49e839a5766899777f76ae767106: Status 404 returned error can't find the container with id 
b05478534cc8178857e877b5d54b89f7b72e49e839a5766899777f76ae767106 Apr 21 07:11:37.595070 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.595020 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" Apr 21 07:11:37.685257 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.685217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.685398 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.685273 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:37.685398 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685378 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:11:37.685398 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685390 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.685369749 +0000 UTC m=+48.653745967 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : configmap references non-existent config key: service-ca.crt Apr 21 07:11:37.685615 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685424 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls podName:68739b76-259c-44cc-ae50-18c754490061 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.685413546 +0000 UTC m=+48.653789767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fz65p" (UID: "68739b76-259c-44cc-ae50-18c754490061") : secret "samples-operator-tls" not found Apr 21 07:11:37.685615 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.685420 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:37.685615 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.685474 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:37.685615 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685483 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:11:37.685615 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685515 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.685504971 +0000 UTC m=+48.653881185 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : secret "router-metrics-certs-default" not found Apr 21 07:11:37.685615 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685611 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:37.685890 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.685665 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls podName:6711400f-c84e-4ed4-a4b1-515e9d0818a7 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.685648593 +0000 UTC m=+48.654024812 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8c2r2" (UID: "6711400f-c84e-4ed4-a4b1-515e9d0818a7") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:37.720595 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.720564 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts"] Apr 21 07:11:37.723578 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:37.723550 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe96b4e1_8214_4d66_86fa_ea8e6ddd030c.slice/crio-de97e242c94be149100e3dcea3b2a73818286bdcdf89444896d912d44af40163 WatchSource:0}: Error finding container de97e242c94be149100e3dcea3b2a73818286bdcdf89444896d912d44af40163: Status 404 returned error can't find the container with id de97e242c94be149100e3dcea3b2a73818286bdcdf89444896d912d44af40163 Apr 21 07:11:37.786696 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.786602 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:37.787125 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.786792 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:11:37.787125 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:37.786872 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert podName:b70e7d80-e8c8-44d5-8f22-7d192e037f9f nodeName:}" failed. No retries permitted until 2026-04-21 07:11:38.786852762 +0000 UTC m=+48.755228987 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngssz" (UID: "b70e7d80-e8c8-44d5-8f22-7d192e037f9f") : secret "networking-console-plugin-cert" not found Apr 21 07:11:37.847937 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.847901 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" event={"ID":"f2ec0ae8-afa6-40cd-943d-465f66eaed59","Type":"ContainerStarted","Data":"8c994c68172ac2a61bde521af4597eb2d3c056a45077eb9492a716b4ffc83c80"} Apr 21 07:11:37.849078 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.849047 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" event={"ID":"01991e8c-8e7b-4dbb-97df-6a5c1999f0eb","Type":"ContainerStarted","Data":"798227581a80f9b1cf0374ac05647dcbfce2a4528acb0b233e5665d8a0da0883"} Apr 21 07:11:37.850173 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.850135 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" event={"ID":"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c","Type":"ContainerStarted","Data":"de97e242c94be149100e3dcea3b2a73818286bdcdf89444896d912d44af40163"} Apr 21 07:11:37.851254 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.851229 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" event={"ID":"421849b1-db63-490a-b9a2-ed853fdbfbc8","Type":"ContainerStarted","Data":"5e6facaf7fa3365a4f5b21c4f1b4d4de7291dff9a1d0ef8464e9f883d8c038eb"} Apr 21 07:11:37.852313 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.852289 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5nqns" event={"ID":"aef3d006-e32d-47ee-b413-d3d17fa20b47","Type":"ContainerStarted","Data":"f6827f8cb6d79b9186be484aadb61df16f1958815310ace52929011688d65062"} Apr 21 07:11:37.855150 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.855097 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6h4bv" event={"ID":"a531b156-35af-430e-b636-9146320cb9f5","Type":"ContainerStarted","Data":"426d370516c72580a64e385dfe6e6ca0dbe3beea4e3857a967406fd2124399af"} Apr 21 07:11:37.856769 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:37.856749 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" event={"ID":"abcc0250-ed3e-47e6-8e0d-cd093fc5184c","Type":"ContainerStarted","Data":"b05478534cc8178857e877b5d54b89f7b72e49e839a5766899777f76ae767106"} Apr 21 07:11:38.697131 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.697090 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:38.697307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.697152 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:38.697307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.697259 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:38.697307 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.697294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:38.697555 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.697484 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:11:38.697669 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.697566 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls podName:68739b76-259c-44cc-ae50-18c754490061 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:40.697546648 +0000 UTC m=+50.665922867 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fz65p" (UID: "68739b76-259c-44cc-ae50-18c754490061") : secret "samples-operator-tls" not found Apr 21 07:11:38.697993 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.697974 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:11:38.698069 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.698029 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:40.69801471 +0000 UTC m=+50.666390939 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : secret "router-metrics-certs-default" not found Apr 21 07:11:38.698132 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.698085 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:38.698132 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.698118 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls podName:6711400f-c84e-4ed4-a4b1-515e9d0818a7 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:11:40.698107203 +0000 UTC m=+50.666483423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8c2r2" (UID: "6711400f-c84e-4ed4-a4b1-515e9d0818a7") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:38.698241 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.698182 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:40.698172404 +0000 UTC m=+50.666548633 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : configmap references non-existent config key: service-ca.crt Apr 21 07:11:38.799077 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.798635 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:38.799077 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.798963 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:11:38.799077 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.799031 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert podName:b70e7d80-e8c8-44d5-8f22-7d192e037f9f nodeName:}" failed. No retries permitted until 2026-04-21 07:11:40.799006494 +0000 UTC m=+50.767382715 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngssz" (UID: "b70e7d80-e8c8-44d5-8f22-7d192e037f9f") : secret "networking-console-plugin-cert" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.900162 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.900236 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:38.900294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.900450 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.900465 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.900539 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:11:54.900505547 +0000 UTC m=+64.868881779 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.901000 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.901050 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:11:54.901033833 +0000 UTC m=+64.869410063 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.901106 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:38.901161 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:38.901135 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:54.901124642 +0000 UTC m=+64.869500861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:40.719430 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:40.719380 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719556 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:40.719576 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:40.719619 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719719 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719783 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:44.719765292 +0000 UTC m=+54.688141524 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : secret "router-metrics-certs-default" not found Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:40.719822 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719829 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719850 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls podName:68739b76-259c-44cc-ae50-18c754490061 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:44.719838997 +0000 UTC m=+54.688215214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fz65p" (UID: "68739b76-259c-44cc-ae50-18c754490061") : secret "samples-operator-tls" not found Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719908 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls podName:6711400f-c84e-4ed4-a4b1-515e9d0818a7 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:44.719897257 +0000 UTC m=+54.688273471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8c2r2" (UID: "6711400f-c84e-4ed4-a4b1-515e9d0818a7") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:40.719927 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.719923 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:44.719915156 +0000 UTC m=+54.688291375 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : configmap references non-existent config key: service-ca.crt Apr 21 07:11:40.821094 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:40.821053 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:40.821291 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.821216 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:11:40.821346 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:40.821296 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert podName:b70e7d80-e8c8-44d5-8f22-7d192e037f9f nodeName:}" failed. No retries permitted until 2026-04-21 07:11:44.821273516 +0000 UTC m=+54.789649744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngssz" (UID: "b70e7d80-e8c8-44d5-8f22-7d192e037f9f") : secret "networking-console-plugin-cert" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.755257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.755307 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.755401 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.755430 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 
07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755583 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755636 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls podName:68739b76-259c-44cc-ae50-18c754490061 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:52.755617395 +0000 UTC m=+62.723993624 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fz65p" (UID: "68739b76-259c-44cc-ae50-18c754490061") : secret "samples-operator-tls" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755694 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755723 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:52.755713217 +0000 UTC m=+62.724089437 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : secret "router-metrics-certs-default" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755772 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755802 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls podName:6711400f-c84e-4ed4-a4b1-515e9d0818a7 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:52.755792474 +0000 UTC m=+62.724168693 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8c2r2" (UID: "6711400f-c84e-4ed4-a4b1-515e9d0818a7") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:44.755923 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.755874 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:11:52.755865245 +0000 UTC m=+62.724241473 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : configmap references non-existent config key: service-ca.crt Apr 21 07:11:44.857327 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.856646 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:44.857327 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.856901 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:11:44.857327 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:44.856964 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert podName:b70e7d80-e8c8-44d5-8f22-7d192e037f9f nodeName:}" failed. No retries permitted until 2026-04-21 07:11:52.856944714 +0000 UTC m=+62.825320934 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngssz" (UID: "b70e7d80-e8c8-44d5-8f22-7d192e037f9f") : secret "networking-console-plugin-cert" not found Apr 21 07:11:44.877250 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.877190 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" event={"ID":"01991e8c-8e7b-4dbb-97df-6a5c1999f0eb","Type":"ContainerStarted","Data":"fa17eccab2ef4c6b27b614bbcd0471b2216d3e1eecde0ccf10333b8701694e97"} Apr 21 07:11:44.878961 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.878925 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" event={"ID":"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c","Type":"ContainerStarted","Data":"84afa20e9ce1632a8f75d01f66f9fc6849afe24e302634dd78a71e565698986e"} Apr 21 07:11:44.880354 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.880332 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" event={"ID":"421849b1-db63-490a-b9a2-ed853fdbfbc8","Type":"ContainerStarted","Data":"5c5a34934d08ee01e9ab47dd5f6d53d4d69c5b59299444b39b2a0a086e54c62a"} Apr 21 07:11:44.887475 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.887433 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5nqns" event={"ID":"aef3d006-e32d-47ee-b413-d3d17fa20b47","Type":"ContainerStarted","Data":"2e7d0381479a28e25e2a3f0c9ae08947ed17132c77652cb8e62f8e6809c9144b"} Apr 21 07:11:44.926213 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.926149 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" podStartSLOduration=2.047152038 
podStartE2EDuration="8.926133424s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.725390329 +0000 UTC m=+47.693766544" lastFinishedPulling="2026-04-21 07:11:44.604371716 +0000 UTC m=+54.572747930" observedRunningTime="2026-04-21 07:11:44.924895374 +0000 UTC m=+54.893271611" watchObservedRunningTime="2026-04-21 07:11:44.926133424 +0000 UTC m=+54.894509661" Apr 21 07:11:44.987587 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:44.985622 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" podStartSLOduration=1.8750886869999999 podStartE2EDuration="8.985601707s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.493860016 +0000 UTC m=+47.462236230" lastFinishedPulling="2026-04-21 07:11:44.604373032 +0000 UTC m=+54.572749250" observedRunningTime="2026-04-21 07:11:44.953929636 +0000 UTC m=+54.922305873" watchObservedRunningTime="2026-04-21 07:11:44.985601707 +0000 UTC m=+54.953977949" Apr 21 07:11:45.022018 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.020907 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-l47f4" podStartSLOduration=1.732834828 podStartE2EDuration="9.02088641s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.216325495 +0000 UTC m=+47.184701710" lastFinishedPulling="2026-04-21 07:11:44.504377074 +0000 UTC m=+54.472753292" observedRunningTime="2026-04-21 07:11:45.018648322 +0000 UTC m=+54.987024559" watchObservedRunningTime="2026-04-21 07:11:45.02088641 +0000 UTC m=+54.989262647" Apr 21 07:11:45.022018 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.021002 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-5nqns" podStartSLOduration=1.879858724 podStartE2EDuration="9.020996141s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.438463895 +0000 UTC m=+47.406840126" lastFinishedPulling="2026-04-21 07:11:44.579601315 +0000 UTC m=+54.547977543" observedRunningTime="2026-04-21 07:11:44.987403542 +0000 UTC m=+54.955779779" watchObservedRunningTime="2026-04-21 07:11:45.020996141 +0000 UTC m=+54.989372381" Apr 21 07:11:45.891827 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.891779 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6h4bv" event={"ID":"a531b156-35af-430e-b636-9146320cb9f5","Type":"ContainerStarted","Data":"928cae28306ecb8ced8ad4ea6356815b73c78bc7507d21762eee4261c7a1fbf2"} Apr 21 07:11:45.893246 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.893203 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" event={"ID":"abcc0250-ed3e-47e6-8e0d-cd093fc5184c","Type":"ContainerStarted","Data":"0226e3e2339f870411ac871b2a53c8c70a4ed56ea2d373d9887b1ced3af6307a"} Apr 21 07:11:45.894796 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.894775 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/0.log" Apr 21 07:11:45.894913 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.894813 2567 generic.go:358] "Generic (PLEG): container finished" 
podID="f2ec0ae8-afa6-40cd-943d-465f66eaed59" containerID="d23aa6046340482be8ff414122950e971a13d45e11b1b38585d364aaad64cdbc" exitCode=255 Apr 21 07:11:45.894979 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.894911 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" event={"ID":"f2ec0ae8-afa6-40cd-943d-465f66eaed59","Type":"ContainerDied","Data":"d23aa6046340482be8ff414122950e971a13d45e11b1b38585d364aaad64cdbc"} Apr 21 07:11:45.895103 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.895085 2567 scope.go:117] "RemoveContainer" containerID="d23aa6046340482be8ff414122950e971a13d45e11b1b38585d364aaad64cdbc" Apr 21 07:11:45.914334 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.914288 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6h4bv" podStartSLOduration=33.465301265 podStartE2EDuration="40.914274709s" podCreationTimestamp="2026-04-21 07:11:05 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.421345947 +0000 UTC m=+47.389722161" lastFinishedPulling="2026-04-21 07:11:44.870319376 +0000 UTC m=+54.838695605" observedRunningTime="2026-04-21 07:11:45.912905855 +0000 UTC m=+55.881282093" watchObservedRunningTime="2026-04-21 07:11:45.914274709 +0000 UTC m=+55.882650945" Apr 21 07:11:45.941688 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:45.941628 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wjjff" podStartSLOduration=2.842627742 podStartE2EDuration="9.941611262s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.514199529 +0000 UTC m=+47.482575742" lastFinishedPulling="2026-04-21 07:11:44.613183049 +0000 UTC m=+54.581559262" observedRunningTime="2026-04-21 07:11:45.941460745 +0000 UTC m=+55.909836984" watchObservedRunningTime="2026-04-21 07:11:45.941611262 +0000 UTC m=+55.909987500" Apr 21 07:11:46.902644 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:46.902615 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/1.log" Apr 21 07:11:46.903110 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:46.903017 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/0.log" Apr 21 07:11:46.903110 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:46.903060 2567 generic.go:358] "Generic (PLEG): container finished" podID="f2ec0ae8-afa6-40cd-943d-465f66eaed59" containerID="f3a7d2df902d450d0b73a604056a826b5dbf1aeddfea99e0d583a696a06d5b65" exitCode=255 Apr 21 07:11:46.903228 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:46.903129 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" event={"ID":"f2ec0ae8-afa6-40cd-943d-465f66eaed59","Type":"ContainerDied","Data":"f3a7d2df902d450d0b73a604056a826b5dbf1aeddfea99e0d583a696a06d5b65"} Apr 21 07:11:46.903228 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:46.903179 2567 scope.go:117] "RemoveContainer" containerID="d23aa6046340482be8ff414122950e971a13d45e11b1b38585d364aaad64cdbc" Apr 21 07:11:46.903580 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:46.903558 2567 scope.go:117] "RemoveContainer" containerID="f3a7d2df902d450d0b73a604056a826b5dbf1aeddfea99e0d583a696a06d5b65" 
Apr 21 07:11:46.903773 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:46.903745 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ckzx_openshift-console-operator(f2ec0ae8-afa6-40cd-943d-465f66eaed59)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" podUID="f2ec0ae8-afa6-40cd-943d-465f66eaed59" Apr 21 07:11:47.237489 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:47.237396 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:47.237489 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:47.237438 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:11:47.790101 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:47.790076 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pbdvz" Apr 21 07:11:47.907451 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:47.907425 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/1.log" Apr 21 07:11:47.907839 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:47.907767 2567 scope.go:117] "RemoveContainer" containerID="f3a7d2df902d450d0b73a604056a826b5dbf1aeddfea99e0d583a696a06d5b65" Apr 21 07:11:47.907957 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:47.907939 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ckzx_openshift-console-operator(f2ec0ae8-afa6-40cd-943d-465f66eaed59)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" podUID="f2ec0ae8-afa6-40cd-943d-465f66eaed59" Apr 21 07:11:48.757844 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:48.757817 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzpkj_64fd83cc-7ef6-4cb9-892b-0111cac9771d/dns-node-resolver/0.log" Apr 21 07:11:48.911149 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:48.911121 2567 scope.go:117] "RemoveContainer" containerID="f3a7d2df902d450d0b73a604056a826b5dbf1aeddfea99e0d583a696a06d5b65" Apr 21 07:11:48.911504 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:48.911293 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8ckzx_openshift-console-operator(f2ec0ae8-afa6-40cd-943d-465f66eaed59)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" podUID="f2ec0ae8-afa6-40cd-943d-465f66eaed59" Apr 21 07:11:49.360196 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:49.360163 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7vkh9_8105a8f5-e174-49e3-ba2e-c9e8b7d649a4/node-ca/0.log" Apr 21 07:11:50.960539 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:50.960500 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lv6ts_fe96b4e1-8214-4d66-86fa-ea8e6ddd030c/kube-storage-version-migrator-operator/0.log" Apr 21 07:11:52.838759 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:52.838724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:52.838759 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:52.838763 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:52.838828 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:52.838854 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.838926 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:08.838898026 +0000 UTC m=+78.807274260 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : configmap references non-existent config key: service-ca.crt Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.838967 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.838968 2567 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.839011 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.839015 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls podName:6711400f-c84e-4ed4-a4b1-515e9d0818a7 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:08.839003603 +0000 UTC m=+78.807379817 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8c2r2" (UID: "6711400f-c84e-4ed4-a4b1-515e9d0818a7") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.839140 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls podName:68739b76-259c-44cc-ae50-18c754490061 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:08.839116232 +0000 UTC m=+78.807492448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fz65p" (UID: "68739b76-259c-44cc-ae50-18c754490061") : secret "samples-operator-tls" not found Apr 21 07:11:52.839313 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.839153 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs podName:d3bff263-210d-4a8a-baab-642a7254c4f8 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:08.839147059 +0000 UTC m=+78.807523273 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs") pod "router-default-6965f7486b-s6vrq" (UID: "d3bff263-210d-4a8a-baab-642a7254c4f8") : secret "router-metrics-certs-default" not found Apr 21 07:11:52.940143 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:52.940109 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:11:52.940311 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.940260 2567 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 21 07:11:52.940360 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:52.940326 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert podName:b70e7d80-e8c8-44d5-8f22-7d192e037f9f nodeName:}" failed. No retries permitted until 2026-04-21 07:12:08.940309742 +0000 UTC m=+78.908685957 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ngssz" (UID: "b70e7d80-e8c8-44d5-8f22-7d192e037f9f") : secret "networking-console-plugin-cert" not found Apr 21 07:11:54.956543 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:54.956485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:54.956629 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956631 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956652 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c96f69849-stwdh: secret "image-registry-tls" not found Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:54.956658 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956702 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls podName:770fe5c5-6bbd-4902-9a64-b38c2ad3329e nodeName:}" failed. No retries permitted until 2026-04-21 07:12:26.956685227 +0000 UTC m=+96.925061441 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls") pod "image-registry-5c96f69849-stwdh" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e") : secret "image-registry-tls" not found Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956752 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956756 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956794 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert podName:212502a2-9d42-4548-be4e-1a54064ecdf5 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:26.956782542 +0000 UTC m=+96.925158756 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert") pod "ingress-canary-qn9pw" (UID: "212502a2-9d42-4548-be4e-1a54064ecdf5") : secret "canary-serving-cert" not found Apr 21 07:11:54.956980 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:54.956810 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls podName:12940af0-6363-4ae3-bd15-0431283aae9a nodeName:}" failed. No retries permitted until 2026-04-21 07:12:26.956798331 +0000 UTC m=+96.925174545 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls") pod "dns-default-qb2h5" (UID: "12940af0-6363-4ae3-bd15-0431283aae9a") : secret "dns-default-metrics-tls" not found Apr 21 07:11:56.268091 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.268051 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:11:56.270498 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.270476 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:11:56.278286 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:56.278263 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 07:11:56.278397 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:11:56.278341 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs podName:6d1ac31b-8866-4817-8119-87e810a0da44 nodeName:}" failed. No retries permitted until 2026-04-21 07:13:00.278319415 +0000 UTC m=+130.246695629 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs") pod "network-metrics-daemon-xxrlv" (UID: "6d1ac31b-8866-4817-8119-87e810a0da44") : secret "metrics-daemon-secret" not found Apr 21 07:11:56.369161 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.369121 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:56.371552 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.371512 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz4vs\" (UniqueName: \"kubernetes.io/projected/39152450-b5d7-466f-b0a7-58dad042db38-kube-api-access-hz4vs\") pod \"network-check-target-fsfmp\" (UID: \"39152450-b5d7-466f-b0a7-58dad042db38\") " pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:56.469545 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.469502 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7djql\"" Apr 21 07:11:56.477670 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.477645 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:56.617706 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.617577 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fsfmp"] Apr 21 07:11:56.620367 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:11:56.620331 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39152450_b5d7_466f_b0a7_58dad042db38.slice/crio-597b24345f695aec7e0751c41f32acafc78ae4ec9892cdf8352b1ac43d78f299 WatchSource:0}: Error finding container 597b24345f695aec7e0751c41f32acafc78ae4ec9892cdf8352b1ac43d78f299: Status 404 returned error can't find the container with id 597b24345f695aec7e0751c41f32acafc78ae4ec9892cdf8352b1ac43d78f299 Apr 21 07:11:56.932055 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.932019 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fsfmp" event={"ID":"39152450-b5d7-466f-b0a7-58dad042db38","Type":"ContainerStarted","Data":"eaa576eea05b3a451b235826b863aa0fd3e52edad6ad31dc5ca8422541a51cda"} Apr 21 07:11:56.932055 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.932057 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fsfmp" event={"ID":"39152450-b5d7-466f-b0a7-58dad042db38","Type":"ContainerStarted","Data":"597b24345f695aec7e0751c41f32acafc78ae4ec9892cdf8352b1ac43d78f299"} Apr 21 07:11:56.932268 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.932087 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:11:56.948245 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:11:56.948181 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fsfmp" podStartSLOduration=65.948167319 
podStartE2EDuration="1m5.948167319s" podCreationTimestamp="2026-04-21 07:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:11:56.947896812 +0000 UTC m=+66.916273049" watchObservedRunningTime="2026-04-21 07:11:56.948167319 +0000 UTC m=+66.916543555" Apr 21 07:12:01.631318 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:01.631285 2567 scope.go:117] "RemoveContainer" containerID="f3a7d2df902d450d0b73a604056a826b5dbf1aeddfea99e0d583a696a06d5b65" Apr 21 07:12:01.946669 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:01.946593 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/1.log" Apr 21 07:12:01.946802 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:01.946675 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" event={"ID":"f2ec0ae8-afa6-40cd-943d-465f66eaed59","Type":"ContainerStarted","Data":"3db9435d218e7e3e3c2014572bd482c7ed10780af15b98e72309ddfc04aa53ee"} Apr 21 07:12:01.947021 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:01.947003 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:12:01.965786 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:01.965740 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" podStartSLOduration=18.758109561 podStartE2EDuration="25.965726661s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:11:37.396822032 +0000 UTC m=+47.365198253" lastFinishedPulling="2026-04-21 07:11:44.604439127 +0000 UTC m=+54.572815353" observedRunningTime="2026-04-21 07:12:01.964631308 +0000 UTC m=+71.933007573" watchObservedRunningTime="2026-04-21 07:12:01.965726661 +0000 UTC m=+71.934102897" Apr 21 07:12:02.406947 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:02.406919 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8ckzx" Apr 21 07:12:08.878019 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.877979 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:08.878019 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.878029 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:12:08.878464 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.878155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " 
pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:08.878464 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.878179 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:12:08.878869 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.878844 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3bff263-210d-4a8a-baab-642a7254c4f8-service-ca-bundle\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:08.880542 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.880496 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/68739b76-259c-44cc-ae50-18c754490061-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fz65p\" (UID: \"68739b76-259c-44cc-ae50-18c754490061\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:12:08.880650 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.880632 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d3bff263-210d-4a8a-baab-642a7254c4f8-metrics-certs\") pod \"router-default-6965f7486b-s6vrq\" (UID: \"d3bff263-210d-4a8a-baab-642a7254c4f8\") " pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:08.880691 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.880648 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6711400f-c84e-4ed4-a4b1-515e9d0818a7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8c2r2\" (UID: \"6711400f-c84e-4ed4-a4b1-515e9d0818a7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:12:08.979144 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.979110 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:12:08.981385 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:08.981359 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b70e7d80-e8c8-44d5-8f22-7d192e037f9f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ngssz\" (UID: \"b70e7d80-e8c8-44d5-8f22-7d192e037f9f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:12:09.024318 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.024287 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-srwvp\"" Apr 21 07:12:09.026925 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.026907 
2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x6bq9\"" Apr 21 07:12:09.028812 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.028796 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" Apr 21 07:12:09.034489 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.034467 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" Apr 21 07:12:09.074816 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.074785 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-x4czj\"" Apr 21 07:12:09.083087 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.082609 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:09.119417 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.119394 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-44s8p\"" Apr 21 07:12:09.119628 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.119592 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" Apr 21 07:12:09.128225 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.128138 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xcdds"] Apr 21 07:12:09.140400 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.140359 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.155121 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.155096 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-9kvhm\"" Apr 21 07:12:09.155273 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.155251 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 07:12:09.155422 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.155406 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 07:12:09.189211 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.189169 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xcdds"] Apr 21 07:12:09.223631 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.223599 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2"] Apr 21 07:12:09.225793 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:09.225767 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6711400f_c84e_4ed4_a4b1_515e9d0818a7.slice/crio-0652809ed5c68172e204885c32918604642a71172307bfc7ce3ac5f242d06235 WatchSource:0}: Error finding container 0652809ed5c68172e204885c32918604642a71172307bfc7ce3ac5f242d06235: Status 404 returned error can't find the container with id 0652809ed5c68172e204885c32918604642a71172307bfc7ce3ac5f242d06235 Apr 21 07:12:09.241449 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.241404 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p"] Apr 21 07:12:09.282160 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.282138 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd692286-feae-4dbc-8b70-cb2a424e2cec-data-volume\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.282260 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.282196 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd692286-feae-4dbc-8b70-cb2a424e2cec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.282315 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.282278 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbhs\" (UniqueName: \"kubernetes.io/projected/dd692286-feae-4dbc-8b70-cb2a424e2cec-kube-api-access-lvbhs\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.282371 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.282349 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/dd692286-feae-4dbc-8b70-cb2a424e2cec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.282468 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.282450 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd692286-feae-4dbc-8b70-cb2a424e2cec-crio-socket\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.297778 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.297754 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6965f7486b-s6vrq"] Apr 21 07:12:09.303908 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:09.303883 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bff263_210d_4a8a_baab_642a7254c4f8.slice/crio-2459d39f98ee150b38c507e21c358fbc4b53b05d80af874eb6167fe0acf0f511 WatchSource:0}: Error finding container 2459d39f98ee150b38c507e21c358fbc4b53b05d80af874eb6167fe0acf0f511: Status 404 returned error can't find the container with id 2459d39f98ee150b38c507e21c358fbc4b53b05d80af874eb6167fe0acf0f511 Apr 21 07:12:09.325990 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.325964 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ngssz"] Apr 21 07:12:09.334616 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:09.334584 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb70e7d80_e8c8_44d5_8f22_7d192e037f9f.slice/crio-115fe299d7f4aa03793f16db1036bb6b8d283fe08fafa450abe4870bb625862a WatchSource:0}: Error finding container 115fe299d7f4aa03793f16db1036bb6b8d283fe08fafa450abe4870bb625862a: Status 404 returned error can't find the container with id 115fe299d7f4aa03793f16db1036bb6b8d283fe08fafa450abe4870bb625862a Apr 21 07:12:09.383189 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.383092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd692286-feae-4dbc-8b70-cb2a424e2cec-crio-socket\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.383189 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.383161 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd692286-feae-4dbc-8b70-cb2a424e2cec-data-volume\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.383189 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.383187 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd692286-feae-4dbc-8b70-cb2a424e2cec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.383405 ip-10-0-131-184 kubenswrapper[2567]: I0421 
07:12:09.383210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dd692286-feae-4dbc-8b70-cb2a424e2cec-crio-socket\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.383405 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.383266 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbhs\" (UniqueName: \"kubernetes.io/projected/dd692286-feae-4dbc-8b70-cb2a424e2cec-kube-api-access-lvbhs\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.383405 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.383316 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd692286-feae-4dbc-8b70-cb2a424e2cec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.383584 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.383554 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dd692286-feae-4dbc-8b70-cb2a424e2cec-data-volume\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.384039 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.384019 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dd692286-feae-4dbc-8b70-cb2a424e2cec-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.385406 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.385385 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dd692286-feae-4dbc-8b70-cb2a424e2cec-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.402190 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.402165 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvbhs\" (UniqueName: \"kubernetes.io/projected/dd692286-feae-4dbc-8b70-cb2a424e2cec-kube-api-access-lvbhs\") pod \"insights-runtime-extractor-xcdds\" (UID: \"dd692286-feae-4dbc-8b70-cb2a424e2cec\") " pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.455782 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.455741 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xcdds" Apr 21 07:12:09.603013 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.602981 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xcdds"] Apr 21 07:12:09.607305 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:09.607271 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd692286_feae_4dbc_8b70_cb2a424e2cec.slice/crio-72b4af827963044fdc131bf05811f5181829d6040089a6095c7a4288e2314e9f WatchSource:0}: Error finding container 72b4af827963044fdc131bf05811f5181829d6040089a6095c7a4288e2314e9f: Status 404 returned error can't find the container with id 72b4af827963044fdc131bf05811f5181829d6040089a6095c7a4288e2314e9f Apr 21 07:12:09.969768 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.969678 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" event={"ID":"b70e7d80-e8c8-44d5-8f22-7d192e037f9f","Type":"ContainerStarted","Data":"115fe299d7f4aa03793f16db1036bb6b8d283fe08fafa450abe4870bb625862a"} Apr 21 07:12:09.971089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.971038 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" event={"ID":"6711400f-c84e-4ed4-a4b1-515e9d0818a7","Type":"ContainerStarted","Data":"0652809ed5c68172e204885c32918604642a71172307bfc7ce3ac5f242d06235"} Apr 21 07:12:09.973518 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.973465 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6965f7486b-s6vrq" event={"ID":"d3bff263-210d-4a8a-baab-642a7254c4f8","Type":"ContainerStarted","Data":"9d7e2991408cf798ad035c824897bffbf66ac893e27fefb4c9a6cb1a6874df60"} Apr 21 07:12:09.973518 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.973498 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6965f7486b-s6vrq" event={"ID":"d3bff263-210d-4a8a-baab-642a7254c4f8","Type":"ContainerStarted","Data":"2459d39f98ee150b38c507e21c358fbc4b53b05d80af874eb6167fe0acf0f511"} Apr 21 07:12:09.975393 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.975344 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xcdds" event={"ID":"dd692286-feae-4dbc-8b70-cb2a424e2cec","Type":"ContainerStarted","Data":"f9ecb5ba5c7592a085729b50c2a857eec8607627abbfa08e3535eca9fc6c42ee"} Apr 21 07:12:09.975393 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.975375 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xcdds" event={"ID":"dd692286-feae-4dbc-8b70-cb2a424e2cec","Type":"ContainerStarted","Data":"72b4af827963044fdc131bf05811f5181829d6040089a6095c7a4288e2314e9f"} Apr 21 07:12:09.976646 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:09.976622 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" event={"ID":"68739b76-259c-44cc-ae50-18c754490061","Type":"ContainerStarted","Data":"f7e7a7532ed2c5bc5a7427da7829272b2be96142b406e6f919caf466b8700638"} Apr 21 07:12:10.083239 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:10.083106 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:10.085887 
ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:10.085856 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:10.121515 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:10.121463 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6965f7486b-s6vrq" podStartSLOduration=34.121449492 podStartE2EDuration="34.121449492s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:12:10.001683473 +0000 UTC m=+79.970059747" watchObservedRunningTime="2026-04-21 07:12:10.121449492 +0000 UTC m=+80.089825728" Apr 21 07:12:10.980276 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:10.980236 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xcdds" event={"ID":"dd692286-feae-4dbc-8b70-cb2a424e2cec","Type":"ContainerStarted","Data":"1182a87cdd05ad12c6e565ea1c5b4b6bc58407b3c5b4a33aaf086055f50d2642"} Apr 21 07:12:10.980597 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:10.980549 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:10.981742 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:10.981722 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6965f7486b-s6vrq" Apr 21 07:12:13.992045 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:13.992011 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" event={"ID":"b70e7d80-e8c8-44d5-8f22-7d192e037f9f","Type":"ContainerStarted","Data":"e9b01c88cd981dd23c4136ca3e8df66d52e5fbc23e95910fba35d72ddfe44c18"} Apr 21 07:12:13.993429 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:13.993404 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" event={"ID":"6711400f-c84e-4ed4-a4b1-515e9d0818a7","Type":"ContainerStarted","Data":"695c0e91aebac771aa05ba63056e44b023d2feffaabadb7c77ec30fe8b7636cc"} Apr 21 07:12:13.995228 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:13.995200 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xcdds" event={"ID":"dd692286-feae-4dbc-8b70-cb2a424e2cec","Type":"ContainerStarted","Data":"2a638325613dbbf4731705c2060059c8ca987055362985a47dca8dd7bb780b34"} Apr 21 07:12:13.996750 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:13.996726 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" event={"ID":"68739b76-259c-44cc-ae50-18c754490061","Type":"ContainerStarted","Data":"eaa97f7ca2c71e87c152a2992a1001355f8ab7c318bd1225b52541b8288f408b"} Apr 21 07:12:13.996750 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:13.996753 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" event={"ID":"68739b76-259c-44cc-ae50-18c754490061","Type":"ContainerStarted","Data":"0356738dff168997611acbfb2f0872c4357d096d71d7d78b0fc5e38900dff442"} Apr 21 07:12:14.020365 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:14.017962 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-ngssz" podStartSLOduration=34.340654892 podStartE2EDuration="38.017942971s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.336566933 +0000 UTC m=+79.304943147" lastFinishedPulling="2026-04-21 07:12:13.013855011 +0000 UTC m=+82.982231226" observedRunningTime="2026-04-21 07:12:14.016197825 +0000 UTC m=+83.984574062" watchObservedRunningTime="2026-04-21 07:12:14.017942971 +0000 UTC m=+83.986319208" Apr 21 07:12:14.050950 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:14.050902 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fz65p" podStartSLOduration=34.282317395 podStartE2EDuration="38.050889769s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.274179088 +0000 UTC m=+79.242555302" lastFinishedPulling="2026-04-21 07:12:13.042751461 +0000 UTC m=+83.011127676" observedRunningTime="2026-04-21 07:12:14.050702647 +0000 UTC m=+84.019078883" watchObservedRunningTime="2026-04-21 07:12:14.050889769 +0000 UTC m=+84.019266005" Apr 21 07:12:14.107428 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:14.107383 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8c2r2" podStartSLOduration=34.298154953 podStartE2EDuration="38.107371296s" podCreationTimestamp="2026-04-21 07:11:36 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.228097421 +0000 UTC m=+79.196473635" lastFinishedPulling="2026-04-21 07:12:13.037313753 +0000 UTC m=+83.005689978" observedRunningTime="2026-04-21 07:12:14.107000606 +0000 UTC m=+84.075376842" watchObservedRunningTime="2026-04-21 07:12:14.107371296 +0000 UTC m=+84.075747532" Apr 21 07:12:14.107827 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:14.107804 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xcdds" podStartSLOduration=1.370052402 podStartE2EDuration="5.107797535s" podCreationTimestamp="2026-04-21 07:12:09 +0000 UTC" firstStartedPulling="2026-04-21 07:12:09.667083643 +0000 UTC m=+79.635459857" lastFinishedPulling="2026-04-21 07:12:13.40482876 +0000 UTC m=+83.373204990" observedRunningTime="2026-04-21 07:12:14.073292287 +0000 UTC m=+84.041668523" watchObservedRunningTime="2026-04-21 07:12:14.107797535 +0000 UTC m=+84.076173841" Apr 21 07:12:22.205303 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.205268 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bw5vh"] Apr 21 07:12:22.210326 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.210299 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.213098 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.213068 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k9dbk\"" Apr 21 07:12:22.213252 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.213161 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 07:12:22.213252 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.213176 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 07:12:22.213379 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.213298 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 07:12:22.213679 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.213662 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 07:12:22.294657 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-root\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.294657 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294655 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-wtmp\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.294870 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294678 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-metrics-client-ca\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.294870 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294752 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.294870 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294801 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-textfile\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.294870 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-tls\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.294870 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294870 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmr2\" (UniqueName: \"kubernetes.io/projected/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-kube-api-access-5cmr2\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.295022 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294891 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-sys\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.295022 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.294929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.395922 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.395882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.395922 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.395926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-textfile\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396133 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.395970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-tls\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396133 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.395987 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmr2\" (UniqueName: \"kubernetes.io/projected/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-kube-api-access-5cmr2\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396133 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396012 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-sys\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " 
pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396133 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396133 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:12:22.396115 2567 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 07:12:22.396362 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396128 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-sys\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396362 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396152 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-root\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396362 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396115 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-root\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396362 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:12:22.396195 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-tls podName:6eb95f88-f13e-40eb-96e2-e5af7eff2dc9 nodeName:}" failed. No retries permitted until 2026-04-21 07:12:22.896174311 +0000 UTC m=+92.864550544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-tls") pod "node-exporter-bw5vh" (UID: "6eb95f88-f13e-40eb-96e2-e5af7eff2dc9") : secret "node-exporter-tls" not found Apr 21 07:12:22.396362 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396281 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-wtmp\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396362 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396319 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-metrics-client-ca\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396629 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396441 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-wtmp\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396629 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396444 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-textfile\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396710 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396693 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.396906 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.396886 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-metrics-client-ca\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.398320 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.398302 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.407555 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.407516 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmr2\" (UniqueName: \"kubernetes.io/projected/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-kube-api-access-5cmr2\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " 
pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.900993 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.900951 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-tls\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:22.903464 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:22.903434 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6eb95f88-f13e-40eb-96e2-e5af7eff2dc9-node-exporter-tls\") pod \"node-exporter-bw5vh\" (UID: \"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9\") " pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:23.120444 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:23.120407 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bw5vh" Apr 21 07:12:23.129718 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:23.129685 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb95f88_f13e_40eb_96e2_e5af7eff2dc9.slice/crio-b849cff5f3a3ec7586a274be0836d17049e48d6d0ec2050700d758228c9f9cec WatchSource:0}: Error finding container b849cff5f3a3ec7586a274be0836d17049e48d6d0ec2050700d758228c9f9cec: Status 404 returned error can't find the container with id b849cff5f3a3ec7586a274be0836d17049e48d6d0ec2050700d758228c9f9cec Apr 21 07:12:24.025002 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:24.024968 2567 generic.go:358] "Generic (PLEG): container finished" podID="6eb95f88-f13e-40eb-96e2-e5af7eff2dc9" containerID="e2f615f60ed0d4d6c9f5848f53e688e979b63231538aa9c2a5b3cfa7403b8787" exitCode=0 Apr 21 07:12:24.025367 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:24.025016 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw5vh" event={"ID":"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9","Type":"ContainerDied","Data":"e2f615f60ed0d4d6c9f5848f53e688e979b63231538aa9c2a5b3cfa7403b8787"} Apr 21 07:12:24.025367 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:24.025058 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw5vh" event={"ID":"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9","Type":"ContainerStarted","Data":"b849cff5f3a3ec7586a274be0836d17049e48d6d0ec2050700d758228c9f9cec"} Apr 21 07:12:25.029589 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:25.029518 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw5vh" event={"ID":"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9","Type":"ContainerStarted","Data":"de9261a4e1a3306abe766ca2c41cdb719529230d7a9edbdd9bef02f1f3e82c4e"} Apr 21 07:12:25.029589 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:25.029593 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bw5vh" event={"ID":"6eb95f88-f13e-40eb-96e2-e5af7eff2dc9","Type":"ContainerStarted","Data":"892d8ffce9f7ec69f7c803f7a4255731e4ef4d285faae6433200af644a45813b"} Apr 21 07:12:25.053814 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:25.053760 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bw5vh" podStartSLOduration=2.400100351 podStartE2EDuration="3.053743785s" podCreationTimestamp="2026-04-21 07:12:22 +0000 UTC" 
firstStartedPulling="2026-04-21 07:12:23.131596151 +0000 UTC m=+93.099972366" lastFinishedPulling="2026-04-21 07:12:23.785239568 +0000 UTC m=+93.753615800" observedRunningTime="2026-04-21 07:12:25.052588865 +0000 UTC m=+95.020965102" watchObservedRunningTime="2026-04-21 07:12:25.053743785 +0000 UTC m=+95.022120020" Apr 21 07:12:27.039118 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.039073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:12:27.039626 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.039127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:12:27.039626 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.039163 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:27.041698 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.041665 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/12940af0-6363-4ae3-bd15-0431283aae9a-metrics-tls\") pod \"dns-default-qb2h5\" (UID: \"12940af0-6363-4ae3-bd15-0431283aae9a\") " pod="openshift-dns/dns-default-qb2h5" Apr 21 07:12:27.041812 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.041715 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"image-registry-5c96f69849-stwdh\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:27.041871 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.041851 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/212502a2-9d42-4548-be4e-1a54064ecdf5-cert\") pod \"ingress-canary-qn9pw\" (UID: \"212502a2-9d42-4548-be4e-1a54064ecdf5\") " pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:12:27.200799 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.200766 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sgb5p\"" Apr 21 07:12:27.208686 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.208662 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:27.253208 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.253173 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vr4v6\"" Apr 21 07:12:27.260823 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.260794 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qn9pw" Apr 21 07:12:27.263320 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.263298 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-b24jv\"" Apr 21 07:12:27.272263 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.271787 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qb2h5" Apr 21 07:12:27.346672 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.346621 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c96f69849-stwdh"] Apr 21 07:12:27.352678 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:27.352647 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod770fe5c5_6bbd_4902_9a64_b38c2ad3329e.slice/crio-50d54e0301bc4c1513c27698c697d88851f8787fa88b269aa09d279623cefbe2 WatchSource:0}: Error finding container 50d54e0301bc4c1513c27698c697d88851f8787fa88b269aa09d279623cefbe2: Status 404 returned error can't find the container with id 50d54e0301bc4c1513c27698c697d88851f8787fa88b269aa09d279623cefbe2 Apr 21 07:12:27.452306 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.452190 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qb2h5"] Apr 21 07:12:27.453348 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.453327 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qn9pw"] Apr 21 07:12:27.454406 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:27.454381 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12940af0_6363_4ae3_bd15_0431283aae9a.slice/crio-3107932c76b9412d6d1610cc9326eb76ad0af5a92781eace2d146695cb9800d1 WatchSource:0}: Error finding container 3107932c76b9412d6d1610cc9326eb76ad0af5a92781eace2d146695cb9800d1: Status 404 returned error can't find the container with id 3107932c76b9412d6d1610cc9326eb76ad0af5a92781eace2d146695cb9800d1 Apr 21 07:12:27.455449 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:12:27.455421 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod212502a2_9d42_4548_be4e_1a54064ecdf5.slice/crio-c854c10a29de2813f1a32f05f9d80ca085827e206db06b46b0242256c7cf0488 WatchSource:0}: Error finding container c854c10a29de2813f1a32f05f9d80ca085827e206db06b46b0242256c7cf0488: Status 404 returned error can't find the container with id c854c10a29de2813f1a32f05f9d80ca085827e206db06b46b0242256c7cf0488 Apr 21 07:12:27.938358 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:27.937978 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fsfmp" Apr 21 07:12:28.040158 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:28.040116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qb2h5" event={"ID":"12940af0-6363-4ae3-bd15-0431283aae9a","Type":"ContainerStarted","Data":"3107932c76b9412d6d1610cc9326eb76ad0af5a92781eace2d146695cb9800d1"} Apr 21 07:12:28.041912 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:28.041876 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qn9pw" 
event={"ID":"212502a2-9d42-4548-be4e-1a54064ecdf5","Type":"ContainerStarted","Data":"c854c10a29de2813f1a32f05f9d80ca085827e206db06b46b0242256c7cf0488"} Apr 21 07:12:28.044940 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:28.044301 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" event={"ID":"770fe5c5-6bbd-4902-9a64-b38c2ad3329e","Type":"ContainerStarted","Data":"c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954"} Apr 21 07:12:28.044940 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:28.044335 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" event={"ID":"770fe5c5-6bbd-4902-9a64-b38c2ad3329e","Type":"ContainerStarted","Data":"50d54e0301bc4c1513c27698c697d88851f8787fa88b269aa09d279623cefbe2"} Apr 21 07:12:28.044940 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:28.044794 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:28.071691 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:28.071634 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" podStartSLOduration=97.07161193 podStartE2EDuration="1m37.07161193s" podCreationTimestamp="2026-04-21 07:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:12:28.06995988 +0000 UTC m=+98.038336118" watchObservedRunningTime="2026-04-21 07:12:28.07161193 +0000 UTC m=+98.039988169" Apr 21 07:12:30.052505 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:30.052400 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qn9pw" event={"ID":"212502a2-9d42-4548-be4e-1a54064ecdf5","Type":"ContainerStarted","Data":"ea59882f6ec20a57a9fd8d2d08e6bec4d7c5a9513380449223d1cba2b9ee3c86"} Apr 21 07:12:30.053925 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:30.053901 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qb2h5" event={"ID":"12940af0-6363-4ae3-bd15-0431283aae9a","Type":"ContainerStarted","Data":"9d164c4ae01cbc15ca2bf9c039b12b55598aadc947a3181bed0e1c216c7f111e"} Apr 21 07:12:30.054063 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:30.053931 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qb2h5" event={"ID":"12940af0-6363-4ae3-bd15-0431283aae9a","Type":"ContainerStarted","Data":"50802841beb47f86414cf4418713496111554136c16be1eb6b2a0ea8f269c8f3"} Apr 21 07:12:30.054063 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:30.054032 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qb2h5" Apr 21 07:12:30.070485 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:30.070430 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qn9pw" podStartSLOduration=66.189401823 podStartE2EDuration="1m8.070416506s" podCreationTimestamp="2026-04-21 07:11:22 +0000 UTC" firstStartedPulling="2026-04-21 07:12:27.457437079 +0000 UTC m=+97.425813293" lastFinishedPulling="2026-04-21 07:12:29.338451759 +0000 UTC m=+99.306827976" observedRunningTime="2026-04-21 07:12:30.069547924 +0000 UTC m=+100.037924154" watchObservedRunningTime="2026-04-21 07:12:30.070416506 +0000 UTC m=+100.038792773" Apr 21 07:12:30.090640 
ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:30.090591 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qb2h5" podStartSLOduration=66.211871745 podStartE2EDuration="1m8.090574252s" podCreationTimestamp="2026-04-21 07:11:22 +0000 UTC" firstStartedPulling="2026-04-21 07:12:27.456728372 +0000 UTC m=+97.425104587" lastFinishedPulling="2026-04-21 07:12:29.33543088 +0000 UTC m=+99.303807094" observedRunningTime="2026-04-21 07:12:30.089046039 +0000 UTC m=+100.057422276" watchObservedRunningTime="2026-04-21 07:12:30.090574252 +0000 UTC m=+100.058950488" Apr 21 07:12:31.636082 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:31.636044 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c96f69849-stwdh"] Apr 21 07:12:40.059112 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:40.059078 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qb2h5" Apr 21 07:12:51.641300 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:51.641271 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:56.128254 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.128215 2567 generic.go:358] "Generic (PLEG): container finished" podID="421849b1-db63-490a-b9a2-ed853fdbfbc8" containerID="5c5a34934d08ee01e9ab47dd5f6d53d4d69c5b59299444b39b2a0a086e54c62a" exitCode=0 Apr 21 07:12:56.128644 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.128273 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" event={"ID":"421849b1-db63-490a-b9a2-ed853fdbfbc8","Type":"ContainerDied","Data":"5c5a34934d08ee01e9ab47dd5f6d53d4d69c5b59299444b39b2a0a086e54c62a"} Apr 21 07:12:56.128644 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.128552 2567 scope.go:117] "RemoveContainer" containerID="5c5a34934d08ee01e9ab47dd5f6d53d4d69c5b59299444b39b2a0a086e54c62a" Apr 21 07:12:56.655256 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.655066 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" podUID="770fe5c5-6bbd-4902-9a64-b38c2ad3329e" containerName="registry" containerID="cri-o://c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954" gracePeriod=30 Apr 21 07:12:56.892454 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.892431 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:56.998144 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998063 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-image-registry-private-configuration\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998144 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998111 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-ca-trust-extracted\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998174 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-bound-sa-token\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998209 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zsrz\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-kube-api-access-7zsrz\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998257 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-installation-pull-secrets\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998284 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-certificates\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998307 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998360 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998339 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-trusted-ca\") pod \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\" (UID: \"770fe5c5-6bbd-4902-9a64-b38c2ad3329e\") " Apr 21 07:12:56.998950 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.998923 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:12:56.999056 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.999029 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:12:56.999119 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:56.999045 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-trusted-ca\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.000752 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.000722 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:12:57.000908 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.000863 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-kube-api-access-7zsrz" (OuterVolumeSpecName: "kube-api-access-7zsrz") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "kube-api-access-7zsrz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:12:57.000908 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.000875 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:12:57.001002 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.000954 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:12:57.001002 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.000981 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:12:57.006518 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.006497 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "770fe5c5-6bbd-4902-9a64-b38c2ad3329e" (UID: "770fe5c5-6bbd-4902-9a64-b38c2ad3329e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:12:57.100421 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100387 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-installation-pull-secrets\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.100421 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100415 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-certificates\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.100632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100430 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-registry-tls\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.100632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100443 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-image-registry-private-configuration\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.100632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100456 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-ca-trust-extracted\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.100632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100469 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-bound-sa-token\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.100632 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.100481 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zsrz\" (UniqueName: \"kubernetes.io/projected/770fe5c5-6bbd-4902-9a64-b38c2ad3329e-kube-api-access-7zsrz\") on node \"ip-10-0-131-184.ec2.internal\" DevicePath \"\"" Apr 21 07:12:57.133107 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.133077 2567 generic.go:358] "Generic (PLEG): container finished" podID="770fe5c5-6bbd-4902-9a64-b38c2ad3329e" containerID="c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954" exitCode=0 Apr 21 07:12:57.133480 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.133146 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" Apr 21 07:12:57.133480 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.133167 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" event={"ID":"770fe5c5-6bbd-4902-9a64-b38c2ad3329e","Type":"ContainerDied","Data":"c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954"} Apr 21 07:12:57.133480 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.133211 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c96f69849-stwdh" event={"ID":"770fe5c5-6bbd-4902-9a64-b38c2ad3329e","Type":"ContainerDied","Data":"50d54e0301bc4c1513c27698c697d88851f8787fa88b269aa09d279623cefbe2"} Apr 21 07:12:57.133480 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.133235 2567 scope.go:117] "RemoveContainer" containerID="c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954" Apr 21 07:12:57.134918 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.134898 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-clcxb" event={"ID":"421849b1-db63-490a-b9a2-ed853fdbfbc8","Type":"ContainerStarted","Data":"ee9436c0dbbb77e5e9d9f9f864b96aeacbe7d4f2febebb611742e669b09cd612"} Apr 21 07:12:57.141427 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.141410 2567 scope.go:117] "RemoveContainer" containerID="c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954" Apr 21 07:12:57.141699 ip-10-0-131-184 kubenswrapper[2567]: E0421 07:12:57.141681 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954\": container with ID starting with c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954 not found: ID does not exist" containerID="c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954" Apr 21 07:12:57.141753 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.141705 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954"} err="failed to get container status \"c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954\": rpc error: code = NotFound desc = could not find container \"c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954\": container with ID starting with c087b1736e4eebb070d93b6f58a291427770a333739ef3af7dd9a5d82ffbf954 not found: ID does not exist" Apr 21 07:12:57.168514 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.168490 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c96f69849-stwdh"] Apr 21 07:12:57.174914 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:57.174890 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c96f69849-stwdh"] Apr 21 07:12:58.635055 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:12:58.635010 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770fe5c5-6bbd-4902-9a64-b38c2ad3329e" path="/var/lib/kubelet/pods/770fe5c5-6bbd-4902-9a64-b38c2ad3329e/volumes" Apr 21 07:13:00.325078 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:00.325043 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:13:00.327211 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:00.327183 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d1ac31b-8866-4817-8119-87e810a0da44-metrics-certs\") pod \"network-metrics-daemon-xxrlv\" (UID: \"6d1ac31b-8866-4817-8119-87e810a0da44\") " pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:13:00.361760 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:00.361732 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-92shx\"" Apr 21 07:13:00.369651 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:00.369631 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xxrlv" Apr 21 07:13:00.487713 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:00.487690 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xxrlv"] Apr 21 07:13:00.490209 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:13:00.490169 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1ac31b_8866_4817_8119_87e810a0da44.slice/crio-d95112a3491ad0a346df6e5c2db9c7383a0ee3a03e1b370048c207c00e446317 WatchSource:0}: Error finding container d95112a3491ad0a346df6e5c2db9c7383a0ee3a03e1b370048c207c00e446317: Status 404 returned error can't find the container with id d95112a3491ad0a346df6e5c2db9c7383a0ee3a03e1b370048c207c00e446317 Apr 21 07:13:01.148375 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:01.148341 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xxrlv" event={"ID":"6d1ac31b-8866-4817-8119-87e810a0da44","Type":"ContainerStarted","Data":"d95112a3491ad0a346df6e5c2db9c7383a0ee3a03e1b370048c207c00e446317"} Apr 21 07:13:02.152667 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:02.152630 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xxrlv" event={"ID":"6d1ac31b-8866-4817-8119-87e810a0da44","Type":"ContainerStarted","Data":"c84778a7eb3f5c060b001eab07fd918879d07961b2b95b7ce3ee0022607e96aa"} Apr 21 07:13:02.152667 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:02.152671 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xxrlv" event={"ID":"6d1ac31b-8866-4817-8119-87e810a0da44","Type":"ContainerStarted","Data":"5515404b0e77f28c2706ca365657e6cdb0f09bddbdba81fd465009174f804e53"} Apr 21 07:13:02.172036 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:02.171991 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xxrlv" podStartSLOduration=131.161713619 podStartE2EDuration="2m12.171977954s" podCreationTimestamp="2026-04-21 07:10:50 +0000 UTC" firstStartedPulling="2026-04-21 07:13:00.492200682 +0000 UTC m=+130.460576897" lastFinishedPulling="2026-04-21 07:13:01.502465009 +0000 UTC m=+131.470841232" observedRunningTime="2026-04-21 07:13:02.171450646 +0000 UTC m=+132.139826882" watchObservedRunningTime="2026-04-21 07:13:02.171977954 +0000 UTC m=+132.140354190" Apr 21 07:13:03.334718 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:03.334661 
2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" podUID="d605a4c8-bdf6-482a-9491-bc1262224419" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 07:13:05.162291 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:05.162261 2567 generic.go:358] "Generic (PLEG): container finished" podID="fe96b4e1-8214-4d66-86fa-ea8e6ddd030c" containerID="84afa20e9ce1632a8f75d01f66f9fc6849afe24e302634dd78a71e565698986e" exitCode=0 Apr 21 07:13:05.162587 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:05.162330 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" event={"ID":"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c","Type":"ContainerDied","Data":"84afa20e9ce1632a8f75d01f66f9fc6849afe24e302634dd78a71e565698986e"} Apr 21 07:13:05.162697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:05.162682 2567 scope.go:117] "RemoveContainer" containerID="84afa20e9ce1632a8f75d01f66f9fc6849afe24e302634dd78a71e565698986e" Apr 21 07:13:06.166579 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:06.166539 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lv6ts" event={"ID":"fe96b4e1-8214-4d66-86fa-ea8e6ddd030c","Type":"ContainerStarted","Data":"5515c37a93a9e194f08e9f8b8449a26914cbb0f9421c2cd9face947c17e0daeb"} Apr 21 07:13:06.172308 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:06.172277 2567 generic.go:358] "Generic (PLEG): container finished" podID="aef3d006-e32d-47ee-b413-d3d17fa20b47" containerID="2e7d0381479a28e25e2a3f0c9ae08947ed17132c77652cb8e62f8e6809c9144b" exitCode=0 Apr 21 07:13:06.172420 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:06.172329 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5nqns" event={"ID":"aef3d006-e32d-47ee-b413-d3d17fa20b47","Type":"ContainerDied","Data":"2e7d0381479a28e25e2a3f0c9ae08947ed17132c77652cb8e62f8e6809c9144b"} Apr 21 07:13:06.172685 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:06.172667 2567 scope.go:117] "RemoveContainer" containerID="2e7d0381479a28e25e2a3f0c9ae08947ed17132c77652cb8e62f8e6809c9144b" Apr 21 07:13:07.176994 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:07.176957 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5nqns" event={"ID":"aef3d006-e32d-47ee-b413-d3d17fa20b47","Type":"ContainerStarted","Data":"89447aa3bff4c3914dc68a11b938b0aea0e738884a251c9cf5608c77e36082df"} Apr 21 07:13:13.334772 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:13.334730 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" podUID="d605a4c8-bdf6-482a-9491-bc1262224419" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 07:13:23.334136 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:23.334096 2567 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" podUID="d605a4c8-bdf6-482a-9491-bc1262224419" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 07:13:23.334634 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:23.334163 2567 
kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" Apr 21 07:13:23.334708 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:23.334638 2567 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"5c8ed5348a018dd92808ca0d052ec3a1a5c0bda69cceddb03db74dbe9891d586"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 07:13:23.334708 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:23.334677 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" podUID="d605a4c8-bdf6-482a-9491-bc1262224419" containerName="service-proxy" containerID="cri-o://5c8ed5348a018dd92808ca0d052ec3a1a5c0bda69cceddb03db74dbe9891d586" gracePeriod=30 Apr 21 07:13:24.230039 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:24.230002 2567 generic.go:358] "Generic (PLEG): container finished" podID="d605a4c8-bdf6-482a-9491-bc1262224419" containerID="5c8ed5348a018dd92808ca0d052ec3a1a5c0bda69cceddb03db74dbe9891d586" exitCode=2 Apr 21 07:13:24.230210 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:24.230065 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" event={"ID":"d605a4c8-bdf6-482a-9491-bc1262224419","Type":"ContainerDied","Data":"5c8ed5348a018dd92808ca0d052ec3a1a5c0bda69cceddb03db74dbe9891d586"} Apr 21 07:13:24.230210 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:13:24.230093 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5ddb8d5989-sz45f" event={"ID":"d605a4c8-bdf6-482a-9491-bc1262224419","Type":"ContainerStarted","Data":"e19dfc937293696e58563c49f96baad47efc25cfbf4de7b08d7c9987eb47a1be"} Apr 21 07:15:22.735187 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:22.735114 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6h4bv_a531b156-35af-430e-b636-9146320cb9f5/global-pull-secret-syncer/0.log" Apr 21 07:15:22.927962 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:22.927927 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v524f_476caf85-7f49-4bd9-944d-7dd2e7975a87/konnectivity-agent/0.log" Apr 21 07:15:22.987322 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:22.987240 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-184.ec2.internal_3b2b2976bec5eff564002b454ef52b93/haproxy/0.log" Apr 21 07:15:26.222915 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:26.222884 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8c2r2_6711400f-c84e-4ed4-a4b1-515e9d0818a7/cluster-monitoring-operator/0.log" Apr 21 07:15:26.392792 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:26.392748 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bw5vh_6eb95f88-f13e-40eb-96e2-e5af7eff2dc9/node-exporter/0.log" Apr 21 07:15:26.419138 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:26.419111 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-bw5vh_6eb95f88-f13e-40eb-96e2-e5af7eff2dc9/kube-rbac-proxy/0.log" Apr 21 07:15:26.445073 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:26.445043 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bw5vh_6eb95f88-f13e-40eb-96e2-e5af7eff2dc9/init-textfile/0.log" Apr 21 07:15:28.217744 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.217719 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ngssz_b70e7d80-e8c8-44d5-8f22-7d192e037f9f/networking-console-plugin/0.log" Apr 21 07:15:28.604629 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.604604 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/1.log" Apr 21 07:15:28.608967 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.608946 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8ckzx_f2ec0ae8-afa6-40cd-943d-465f66eaed59/console-operator/2.log" Apr 21 07:15:28.889706 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.889618 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl"] Apr 21 07:15:28.889956 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.889941 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="770fe5c5-6bbd-4902-9a64-b38c2ad3329e" containerName="registry" Apr 21 07:15:28.890001 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.889958 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="770fe5c5-6bbd-4902-9a64-b38c2ad3329e" containerName="registry" Apr 21 07:15:28.890034 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.890009 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="770fe5c5-6bbd-4902-9a64-b38c2ad3329e" containerName="registry" Apr 21 07:15:28.892970 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.892950 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:28.895840 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.895820 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nhprw\"/\"default-dockercfg-p9c5q\"" Apr 21 07:15:28.896090 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.896061 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nhprw\"/\"openshift-service-ca.crt\"" Apr 21 07:15:28.896194 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.896123 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nhprw\"/\"kube-root-ca.crt\"" Apr 21 07:15:28.902768 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:28.902747 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl"] Apr 21 07:15:29.045658 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.045620 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhkr\" (UniqueName: \"kubernetes.io/projected/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-kube-api-access-cbhkr\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.045861 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.045663 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-lib-modules\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.045861 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.045819 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-sys\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.045991 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.045873 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-podres\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.045991 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.045922 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-proc\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147404 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-podres\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " 
pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147404 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147366 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-proc\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147404 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhkr\" (UniqueName: \"kubernetes.io/projected/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-kube-api-access-cbhkr\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147673 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147419 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-lib-modules\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147673 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-sys\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147673 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-podres\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147673 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147498 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-proc\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147673 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147564 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-sys\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.147673 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.147629 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-lib-modules\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.155880 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.155847 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-cbhkr\" (UniqueName: \"kubernetes.io/projected/7928a711-b23b-4b36-a0cd-fda51b2e4b2c-kube-api-access-cbhkr\") pod \"perf-node-gather-daemonset-fxzdl\" (UID: \"7928a711-b23b-4b36-a0cd-fda51b2e4b2c\") " pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.202516 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.202480 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.321785 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.321757 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl"] Apr 21 07:15:29.324182 ip-10-0-131-184 kubenswrapper[2567]: W0421 07:15:29.324152 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7928a711_b23b_4b36_a0cd_fda51b2e4b2c.slice/crio-6e74ccb7f9d8b30c5d5b1cbc12a0007d2c570a64ee4ad322a9985c75f2bac672 WatchSource:0}: Error finding container 6e74ccb7f9d8b30c5d5b1cbc12a0007d2c570a64ee4ad322a9985c75f2bac672: Status 404 returned error can't find the container with id 6e74ccb7f9d8b30c5d5b1cbc12a0007d2c570a64ee4ad322a9985c75f2bac672 Apr 21 07:15:29.448745 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.448722 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-l47f4_01991e8c-8e7b-4dbb-97df-6a5c1999f0eb/volume-data-source-validator/0.log" Apr 21 07:15:29.569263 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.569224 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" event={"ID":"7928a711-b23b-4b36-a0cd-fda51b2e4b2c","Type":"ContainerStarted","Data":"5cfe9e87eeb60b8bd284db8b6217a0f6196c700d923d11d0c10675f0a5a91f48"} Apr 21 07:15:29.569434 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.569271 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" event={"ID":"7928a711-b23b-4b36-a0cd-fda51b2e4b2c","Type":"ContainerStarted","Data":"6e74ccb7f9d8b30c5d5b1cbc12a0007d2c570a64ee4ad322a9985c75f2bac672"} Apr 21 07:15:29.569434 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.569304 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:29.589321 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:29.589273 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" podStartSLOduration=1.589257776 podStartE2EDuration="1.589257776s" podCreationTimestamp="2026-04-21 07:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:15:29.587416737 +0000 UTC m=+279.555792972" watchObservedRunningTime="2026-04-21 07:15:29.589257776 +0000 UTC m=+279.557634012" Apr 21 07:15:30.260434 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:30.260401 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qb2h5_12940af0-6363-4ae3-bd15-0431283aae9a/dns/0.log" Apr 21 07:15:30.285922 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:30.285897 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-qb2h5_12940af0-6363-4ae3-bd15-0431283aae9a/kube-rbac-proxy/0.log" Apr 21 07:15:30.362959 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:30.362919 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qzpkj_64fd83cc-7ef6-4cb9-892b-0111cac9771d/dns-node-resolver/0.log" Apr 21 07:15:30.778146 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:30.778119 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7vkh9_8105a8f5-e174-49e3-ba2e-c9e8b7d649a4/node-ca/0.log" Apr 21 07:15:31.539653 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:31.539623 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6965f7486b-s6vrq_d3bff263-210d-4a8a-baab-642a7254c4f8/router/0.log" Apr 21 07:15:31.882317 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:31.882283 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qn9pw_212502a2-9d42-4548-be4e-1a54064ecdf5/serve-healthcheck-canary/0.log" Apr 21 07:15:32.193428 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:32.193348 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5nqns_aef3d006-e32d-47ee-b413-d3d17fa20b47/insights-operator/1.log" Apr 21 07:15:32.199052 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:32.199029 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5nqns_aef3d006-e32d-47ee-b413-d3d17fa20b47/insights-operator/0.log" Apr 21 07:15:32.362967 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:32.362931 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xcdds_dd692286-feae-4dbc-8b70-cb2a424e2cec/kube-rbac-proxy/0.log" Apr 21 07:15:32.382548 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:32.382496 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xcdds_dd692286-feae-4dbc-8b70-cb2a424e2cec/exporter/0.log" Apr 21 07:15:32.403412 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:32.403383 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xcdds_dd692286-feae-4dbc-8b70-cb2a424e2cec/extractor/0.log" Apr 21 07:15:35.582983 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:35.582957 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nhprw/perf-node-gather-daemonset-fxzdl" Apr 21 07:15:36.496589 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:36.496550 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lv6ts_fe96b4e1-8214-4d66-86fa-ea8e6ddd030c/kube-storage-version-migrator-operator/1.log" Apr 21 07:15:36.497366 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:36.497346 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lv6ts_fe96b4e1-8214-4d66-86fa-ea8e6ddd030c/kube-storage-version-migrator-operator/0.log" Apr 21 07:15:37.386507 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.386476 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/kube-multus-additional-cni-plugins/0.log" Apr 
21 07:15:37.409142 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.409113 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/egress-router-binary-copy/0.log" Apr 21 07:15:37.437504 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.437478 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/cni-plugins/0.log" Apr 21 07:15:37.463766 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.463748 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/bond-cni-plugin/0.log" Apr 21 07:15:37.496295 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.496278 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/routeoverride-cni/0.log" Apr 21 07:15:37.524959 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.524935 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/whereabouts-cni-bincopy/0.log" Apr 21 07:15:37.546639 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.546622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9t6vk_ca549d86-e91c-4488-bfac-cf936e205050/whereabouts-cni/0.log" Apr 21 07:15:37.980007 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:37.979975 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bs4gw_918e28f2-6377-405c-885f-92621fe803a0/kube-multus/0.log" Apr 21 07:15:38.195328 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:38.195249 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xxrlv_6d1ac31b-8866-4817-8119-87e810a0da44/network-metrics-daemon/0.log" Apr 21 07:15:38.225868 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:38.225843 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xxrlv_6d1ac31b-8866-4817-8119-87e810a0da44/kube-rbac-proxy/0.log" Apr 21 07:15:39.187358 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.187288 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/ovn-controller/0.log" Apr 21 07:15:39.209391 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.209361 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/ovn-acl-logging/0.log" Apr 21 07:15:39.243089 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.243067 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/kube-rbac-proxy-node/0.log" Apr 21 07:15:39.268155 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.268128 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 07:15:39.291192 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.291159 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/northd/0.log" Apr 21 07:15:39.319886 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.319864 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/nbdb/0.log" Apr 21 07:15:39.345423 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.345398 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/sbdb/0.log" Apr 21 07:15:39.438640 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:39.438581 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pbdvz_c395523b-6f94-447f-a14f-b3e86618c396/ovnkube-controller/0.log" Apr 21 07:15:40.729819 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:40.729787 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-wjjff_abcc0250-ed3e-47e6-8e0d-cd093fc5184c/check-endpoints/0.log" Apr 21 07:15:40.755697 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:40.755656 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fsfmp_39152450-b5d7-466f-b0a7-58dad042db38/network-check-target-container/0.log" Apr 21 07:15:41.691542 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:41.691489 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5hf89_8352266b-7f87-4b49-9222-1a7518a8bda8/iptables-alerter/0.log" Apr 21 07:15:42.379883 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:42.379856 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vxq2w_7c0141af-1317-4665-bd56-7841a1731312/tuned/0.log" Apr 21 07:15:43.956866 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:43.956832 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-fz65p_68739b76-259c-44cc-ae50-18c754490061/cluster-samples-operator/0.log" Apr 21 07:15:43.975461 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:43.975441 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-fz65p_68739b76-259c-44cc-ae50-18c754490061/cluster-samples-operator-watch/0.log" Apr 21 07:15:44.871109 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:44.871079 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-clcxb_421849b1-db63-490a-b9a2-ed853fdbfbc8/service-ca-operator/1.log" Apr 21 07:15:44.871909 ip-10-0-131-184 kubenswrapper[2567]: I0421 07:15:44.871892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-clcxb_421849b1-db63-490a-b9a2-ed853fdbfbc8/service-ca-operator/0.log"