Apr 22 17:50:32.276110 ip-10-0-135-143 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 17:50:32.276126 ip-10-0-135-143 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 17:50:32.276135 ip-10-0-135-143 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 17:50:32.276456 ip-10-0-135-143 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 17:50:42.431650 ip-10-0-135-143 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 17:50:42.431670 ip-10-0-135-143 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 9314aa51e8f041738f39f4c5406c417e --
Apr 22 17:52:57.778623 ip-10-0-135-143 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:52:58.158634 ip-10-0-135-143 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:52:58.158634 ip-10-0-135-143 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:52:58.158634 ip-10-0-135-143 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:52:58.158634 ip-10-0-135-143 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:52:58.158634 ip-10-0-135-143 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:52:58.160059 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.159971 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:52:58.164032 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164016 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:52:58.164032 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164031 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164035 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164038 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164041 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164044 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164047 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164049 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164052 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164054 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164057 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164059 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164062 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164065 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164067 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164070 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164080 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164083 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164085 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164088 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:52:58.164093 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164091 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164093 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164096 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164099 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164102 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164105 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164108 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164110 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164113 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164115 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164118 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164120 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164123 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164126 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164128 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164131 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164134 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164136 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164139 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164141 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:52:58.164585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164144 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164146 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164149 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164152 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164157 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164160 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164164 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164167 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164170 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164173 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164175 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164178 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164181 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164184 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164188 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164191 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164194 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164196 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164199 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:52:58.165067 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164202 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164204 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164207 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164210 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164214 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164218 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164221 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164223 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164226 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164229 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164231 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164234 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164236 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164239 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164241 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164244 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164247 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164250 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164252 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164255 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:52:58.165542 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164257 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164262 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164266 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164269 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164272 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164275 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164278 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164693 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164699 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164702 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164705 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164708 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164710 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164713 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164715 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164718 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164720 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164723 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164726 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164729 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:52:58.166016 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164733 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164736 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164739 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164741 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164744 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164746 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164749 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164751 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164754 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164757 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164759 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164762 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164764 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164767 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164769 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164771 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164775 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164777 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164780 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:52:58.166509 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164783 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164786 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164788 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164791 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164793 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164796 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164799 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164802 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164804 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164807 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164809 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164812 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164814 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164817 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164819 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164821 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164824 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164827 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164829 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:52:58.167010 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164832 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164834 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164837 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164840 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164843 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164845 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164848 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164850 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164853 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164856 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164858 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164861 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164863 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164866 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164869 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164871 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164873 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164876 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164878 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164881 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:52:58.167475 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164883 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164886 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164888 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164891 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164893 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164896 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164898 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164900 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164903 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164905 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164908 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164910 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164915 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164919 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.164922 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165444 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165455 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165462 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165466 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165471 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165474 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:52:58.168075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165480 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165484 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165488 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165491 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165495 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165498 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165501 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165504 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165507 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165510 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165513 2574 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165516 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165519 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165523 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165526 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165529 2574 flags.go:64] FLAG: --config-dir=""
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165532 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165536 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165540 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165543 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165546 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165549 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165553 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165556 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:52:58.168618 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165559 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165562 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165565 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165570 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165573 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165576 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165579 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165582 2574 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165585 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165590 2574 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165594 2574 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165597 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165601 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165604 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165608 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165611 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165614 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165617 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165620 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165623 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165626 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165629 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165632 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165635 2574 flags.go:64] FLAG:
--fail-swap-on="true" Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165638 2574 flags.go:64] FLAG: --feature-gates="" Apr 22 17:52:58.169194 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165642 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165645 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165648 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165651 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165654 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165657 2574 flags.go:64] FLAG: --help="false" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165660 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-135-143.ec2.internal" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165663 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165666 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165669 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165672 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165675 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 
17:52:58.165679 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165682 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165684 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165687 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165690 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165693 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165696 2574 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165699 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165702 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165705 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165708 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165711 2574 flags.go:64] FLAG: --lock-file="" Apr 22 17:52:58.169814 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165714 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165717 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165720 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:52:58.170406 
ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165725 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165728 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165731 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165734 2574 flags.go:64] FLAG: --logging-format="text" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165736 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165739 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165742 2574 flags.go:64] FLAG: --manifest-url="" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165745 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165749 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165753 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165757 2574 flags.go:64] FLAG: --max-pods="110" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165760 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165763 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165766 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165769 2574 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165772 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165775 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165778 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165787 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165790 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165793 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:52:58.170406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165796 2574 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165798 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165805 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165807 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165810 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165813 2574 flags.go:64] FLAG: --port="10250" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165816 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: 
I0422 17:52:58.165819 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cee82c82366dc0fc" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165822 2574 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165825 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165828 2574 flags.go:64] FLAG: --register-node="true" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165831 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165834 2574 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165837 2574 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165840 2574 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165843 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165846 2574 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165850 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165852 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165855 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165858 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165865 2574 flags.go:64] FLAG: --runonce="false" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: 
I0422 17:52:58.165868 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165871 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165874 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:52:58.170979 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165877 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165880 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165883 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165887 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165890 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165893 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165896 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165899 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165902 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165905 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165908 2574 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165911 2574 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165916 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165919 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165922 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165926 2574 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165929 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165932 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165935 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165937 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165941 2574 flags.go:64] FLAG: --v="2" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165945 2574 flags.go:64] FLAG: --version="false" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165949 2574 flags.go:64] FLAG: --vmodule="" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165953 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.165956 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:52:58.171579 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166055 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:52:58.172178 ip-10-0-135-143 
kubenswrapper[2574]: W0422 17:52:58.166059 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166064 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166067 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166070 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166073 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166076 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166078 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166081 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166084 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166087 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166089 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166092 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166095 2574 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166098 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166100 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166103 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166105 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166108 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166110 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:52:58.172178 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166113 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166115 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166119 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166121 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166124 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166126 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 
17:52:58.166129 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166131 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166134 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166136 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166139 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166142 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166144 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166147 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166151 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166154 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166159 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166162 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166165 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:52:58.172730 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166167 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166170 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166173 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166176 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166178 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166181 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166184 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166187 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166189 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166192 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166194 2574 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166197 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166199 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166202 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166204 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166207 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166210 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166212 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166215 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166217 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:52:58.173195 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166220 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166222 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166225 2574 feature_gate.go:328] unrecognized feature gate: 
AWSDedicatedHosts Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166227 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166230 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166232 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166235 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166240 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166242 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166244 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166247 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166250 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166252 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166255 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166257 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166260 2574 feature_gate.go:328] unrecognized 
feature gate: AzureWorkloadIdentity Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166262 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166266 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166270 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166273 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:52:58.173699 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166277 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166280 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166282 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166285 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166288 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166291 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.166294 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.166805 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false 
EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.173629 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.173644 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173691 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173696 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173700 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173704 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173707 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173711 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:52:58.174199 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173714 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173717 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 
17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173720 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173723 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173726 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173728 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173731 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173733 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173736 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173739 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173741 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173744 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173746 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173749 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173751 2574 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173754 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173763 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173766 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173768 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173771 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:52:58.174611 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173773 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173776 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173778 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173782 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173786 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173791 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173793 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173796 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173799 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173802 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173805 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173807 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173810 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173813 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173815 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173818 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173820 2574 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173822 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173825 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:52:58.175102 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173827 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173830 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173832 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173834 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173837 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173839 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173842 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173844 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173847 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173849 2574 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173859 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173861 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173864 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173866 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173869 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173871 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173874 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173876 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173879 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173882 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:52:58.175570 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173885 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173887 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173890 2574 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173892 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173895 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173897 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173900 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173902 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173905 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173907 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173910 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173912 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173914 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173917 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173919 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 
17:52:58.173923 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173927 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173929 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173932 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:52:58.176109 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173934 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.173937 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.173942 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174049 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174055 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174058 2574 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174061 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174064 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174067 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174069 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174072 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174075 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174077 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174080 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174082 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:52:58.176585 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174085 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174087 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174090 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:52:58.176943 ip-10-0-135-143 
kubenswrapper[2574]: W0422 17:52:58.174092 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174095 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174097 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174100 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174102 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174105 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174107 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174110 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174112 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174115 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174117 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174120 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174122 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration 
Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174124 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174127 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174129 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174131 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:52:58.176943 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174134 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174136 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174140 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174143 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174145 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174148 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174150 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174153 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174155 2574 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174158 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174160 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174163 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174165 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174167 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174170 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174172 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174175 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174177 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174180 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174182 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:52:58.177441 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174185 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174187 2574 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174191 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174194 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174197 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174200 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174203 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174206 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174208 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174211 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174214 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174217 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174220 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174222 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores 
Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174225 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174229 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174232 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174235 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174238 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:52:58.177920 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174241 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174243 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174246 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174248 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174250 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174253 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174256 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:52:58.178498 ip-10-0-135-143 
kubenswrapper[2574]: W0422 17:52:58.174258 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174260 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174263 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174266 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174268 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174270 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174273 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:58.174275 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:52:58.178498 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.174280 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:52:58.178872 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.174923 2574 server.go:962] 
"Client rotation is on, will bootstrap in background" Apr 22 17:52:58.179374 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.179360 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 17:52:58.180275 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.180263 2574 server.go:1019] "Starting client certificate rotation" Apr 22 17:52:58.180378 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.180364 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:52:58.180413 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.180396 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:52:58.204911 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.204894 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:52:58.207225 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.207204 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:52:58.222288 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.222270 2574 log.go:25] "Validated CRI v1 runtime API" Apr 22 17:52:58.226994 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.226974 2574 log.go:25] "Validated CRI v1 image API" Apr 22 17:52:58.228242 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.228227 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 17:52:58.231631 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.231604 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 da5d3741-97a0-45ad-837d-732ca4507625:/dev/nvme0n1p3 ec4e0d4b-6e1f-488a-b7fa-d246ea4cd5ff:/dev/nvme0n1p4] Apr 22 17:52:58.231707 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.231628 2574 fs.go:136] 
Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 17:52:58.234392 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.234372 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:52:58.236609 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.236497 2574 manager.go:217] Machine: {Timestamp:2026-04-22 17:52:58.235466149 +0000 UTC m=+0.357576315 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100265 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cfcba0792d1f163bcf9daf6fd0dba SystemUUID:ec2cfcba-0792-d1f1-63bc-f9daf6fd0dba BootID:9314aa51-e8f0-4173-8f39-f4c5406c417e Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 
HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ba:4a:d9:15:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ba:4a:d9:15:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:db:85:98:b3:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 17:52:58.236609 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.236598 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 17:52:58.236775 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.236703 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:52:58.237713 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.237683 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:52:58.237882 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.237715 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-143.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:52:58.238011 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.237896 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:52:58.238011 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.237909 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:52:58.238011 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.237927 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:52:58.238615 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.238601 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:52:58.240104 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.240092 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:52:58.240235 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.240223 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:52:58.242212 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.242200 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:52:58.242270 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.242217 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:52:58.242270 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.242233 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:52:58.242270 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.242246 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:52:58.242270 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.242259 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 17:52:58.243203 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.243189 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:52:58.243271 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.243213 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 17:52:58.245849 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.245834 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 17:52:58.247396 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.247384 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 17:52:58.248550 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248537 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248554 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248567 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248575 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248581 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248586 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248593 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248598 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248606 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248612 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 17:52:58.248622 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248626 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 17:52:58.248884 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248635 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 17:52:58.248884 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248670 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 17:52:58.248884 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.248676 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 17:52:58.252182 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.252169 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 17:52:58.252233 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.252203 2574 server.go:1295] "Started kubelet"
Apr 22 17:52:58.252346 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.252303 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 17:52:58.252416 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.252289 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 17:52:58.252455 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.252445 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 17:52:58.253013 ip-10-0-135-143 systemd[1]: Started Kubernetes Kubelet.
Apr 22 17:52:58.256819 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.256796 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 17:52:58.257226 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.257160 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-143.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 17:52:58.257444 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.257390 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 17:52:58.257444 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.257427 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-143.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 17:52:58.257867 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.257851 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 17:52:58.261760 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.260148 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-143.ec2.internal.18a8bf4a2a153f6f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-143.ec2.internal,UID:ip-10-0-135-143.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-143.ec2.internal,},FirstTimestamp:2026-04-22 17:52:58.252181359 +0000 UTC m=+0.374291525,LastTimestamp:2026-04-22 17:52:58.252181359 +0000 UTC m=+0.374291525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-143.ec2.internal,}"
Apr 22 17:52:58.263806 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.263677 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 17:52:58.264086 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.264066 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 17:52:58.264746 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.264731 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 17:52:58.264746 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.264746 2574 factory.go:55] Registering systemd factory
Apr 22 17:52:58.264853 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.264754 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 22 17:52:58.265022 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.264997 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.265113 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265074 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 17:52:58.265113 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265074 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 17:52:58.265113 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265100 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265149 2574 factory.go:153] Registering CRI-O factory
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265160 2574 factory.go:223] Registration of the crio container factory successfully
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265182 2574 factory.go:103] Registering Raw factory
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265198 2574 manager.go:1196] Started watching for new ooms in manager
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265200 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265211 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 17:52:58.265255 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.265214 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 17:52:58.265754 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.265609 2574 manager.go:319] Starting recovery of all containers
Apr 22 17:52:58.271886 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.271847 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 17:52:58.274055 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.274027 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-143.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 17:52:58.274170 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.274048 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 17:52:58.276649 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.276632 2574 manager.go:324] Recovery completed
Apr 22 17:52:58.280482 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.280470 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:52:58.283337 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.283308 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:52:58.283424 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.283353 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:52:58.283424 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.283370 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:52:58.283823 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.283806 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 17:52:58.283823 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.283821 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 17:52:58.283941 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.283838 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:52:58.286031 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.286018 2574 policy_none.go:49] "None policy: Start"
Apr 22 17:52:58.286075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.286036 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 17:52:58.286075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.286046 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 17:52:58.292832 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.292771 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-143.ec2.internal.18a8bf4a2bf0a0f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-143.ec2.internal,UID:ip-10-0-135-143.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-143.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-143.ec2.internal,},FirstTimestamp:2026-04-22 17:52:58.283335929 +0000 UTC m=+0.405446097,LastTimestamp:2026-04-22 17:52:58.283335929 +0000 UTC m=+0.405446097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-143.ec2.internal,}"
Apr 22 17:52:58.293477 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.293462 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jmbbk"
Apr 22 17:52:58.302088 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.301943 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jmbbk"
Apr 22 17:52:58.303857 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.303764 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-143.ec2.internal.18a8bf4a2bf10017 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-143.ec2.internal,UID:ip-10-0-135-143.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-143.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-143.ec2.internal,},FirstTimestamp:2026-04-22 17:52:58.283360279 +0000 UTC m=+0.405470449,LastTimestamp:2026-04-22 17:52:58.283360279 +0000 UTC m=+0.405470449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-143.ec2.internal,}"
Apr 22 17:52:58.321961 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.321948 2574 manager.go:341] "Starting Device Plugin manager"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.321982 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.321992 2574 server.go:85] "Starting device plugin registration server"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.322221 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.322233 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.322308 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.322414 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.322425 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.323001 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 17:52:58.335755 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.323039 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.362745 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.362718 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 17:52:58.362745 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.362743 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 17:52:58.362881 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.362761 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 17:52:58.362881 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.362767 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 17:52:58.362881 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.362795 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 17:52:58.366519 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.366503 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:52:58.422794 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.422741 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:52:58.423821 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.423804 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:52:58.423914 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.423838 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:52:58.423914 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.423852 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:52:58.423914 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.423882 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.432602 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.432588 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.432667 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.432608 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-143.ec2.internal\": node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.444720 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.444704 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.463715 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.463696 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"]
Apr 22 17:52:58.463774 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.463758 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:52:58.464892 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.464878 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:52:58.464956 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.464903 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:52:58.464956 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.464914 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:52:58.466046 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466029 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cfd22163f3b6268afb9f8cf0bead0178-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal\" (UID: \"cfd22163f3b6268afb9f8cf0bead0178\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.466091 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfd22163f3b6268afb9f8cf0bead0178-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal\" (UID: \"cfd22163f3b6268afb9f8cf0bead0178\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.466303 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466291 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:52:58.466473 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.466506 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466487 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:52:58.466952 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466938 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:52:58.466996 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466965 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:52:58.466996 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466944 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:52:58.467055 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.467002 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:52:58.467055 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.467015 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:52:58.467055 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.466976 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:52:58.468170 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.468151 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.468228 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.468179 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 17:52:58.468977 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.468954 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 17:52:58.469070 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.468984 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 17:52:58.469070 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.468995 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeHasSufficientPID"
Apr 22 17:52:58.487901 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.487880 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-143.ec2.internal\" not found" node="ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.492046 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.492031 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-143.ec2.internal\" not found" node="ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.545787 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.545764 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.566418 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.566398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cfd22163f3b6268afb9f8cf0bead0178-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal\" (UID: \"cfd22163f3b6268afb9f8cf0bead0178\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.566488 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.566423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfd22163f3b6268afb9f8cf0bead0178-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal\" (UID: \"cfd22163f3b6268afb9f8cf0bead0178\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.566488 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.566443 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5b16281a969b22277334b5b3f7efb159-config\") pod \"kube-apiserver-proxy-ip-10-0-135-143.ec2.internal\" (UID: \"5b16281a969b22277334b5b3f7efb159\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.566552 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.566490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfd22163f3b6268afb9f8cf0bead0178-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal\" (UID: \"cfd22163f3b6268afb9f8cf0bead0178\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.566552 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.566495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cfd22163f3b6268afb9f8cf0bead0178-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal\" (UID: \"cfd22163f3b6268afb9f8cf0bead0178\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.646015 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.645994 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.667332 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.667313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5b16281a969b22277334b5b3f7efb159-config\") pod \"kube-apiserver-proxy-ip-10-0-135-143.ec2.internal\" (UID: \"5b16281a969b22277334b5b3f7efb159\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.667397 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.667380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5b16281a969b22277334b5b3f7efb159-config\") pod \"kube-apiserver-proxy-ip-10-0-135-143.ec2.internal\" (UID: \"5b16281a969b22277334b5b3f7efb159\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.746782 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.746728 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.790281 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.790252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.794708 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:58.794691 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:58.847221 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.847197 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:58.947706 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:58.947683 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:59.048272 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.048209 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-143.ec2.internal\" not found"
Apr 22 17:52:59.137336 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.137298 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:52:59.165311 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.165291 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:59.178939 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.178910 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:52:59.180420 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.180403 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:52:59.180543 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.180524 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:52:59.180589 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.180528 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:52:59.180824 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.180809 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal"
Apr 22 17:52:59.196807 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.196788 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:52:59.243040 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.243005 2574 apiserver.go:52] "Watching apiserver"
Apr 22 17:52:59.251480 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.251457 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:52:59.252945 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.252924 2574 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal","openshift-multus/multus-555gb","openshift-multus/multus-additional-cni-plugins-v2plq","openshift-network-diagnostics/network-check-target-sbf9k","openshift-network-operator/iptables-alerter-rlmgg","openshift-ovn-kubernetes/ovnkube-node-qv8jk","kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk","openshift-cluster-node-tuning-operator/tuned-gtxxh","openshift-multus/network-metrics-daemon-hgdxx","kube-system/konnectivity-agent-mp6ht","openshift-image-registry/node-ca-wl6lv"] Apr 22 17:52:59.255054 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.255032 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-555gb" Apr 22 17:52:59.257532 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.257500 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.258593 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.258575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k" Apr 22 17:52:59.258695 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.258640 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058" Apr 22 17:52:59.258695 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.258649 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 17:52:59.259752 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.259694 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:52:59.259752 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.259719 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:52:59.259907 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.259753 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:52:59.259907 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.259726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2frrq\"" Apr 22 17:52:59.260016 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.259975 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:52:59.260202 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.260182 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:52:59.260615 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.260593 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:52:59.260711 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.260663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-h2rxh\"" Apr 22 17:52:59.260994 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.260977 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.261097 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.261049 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.261965 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.261946 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9ml8k\"" Apr 22 17:52:59.262179 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.262161 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:52:59.262263 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.262241 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.262983 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.262968 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:52:59.263423 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.263407 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:52:59.263489 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.263470 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c" Apr 22 17:52:59.263920 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.263906 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 17:52:59.264043 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.264024 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:52:59.264246 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.264081 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:52:59.264246 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.264197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:52:59.264408 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.264344 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:52:59.264724 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.264710 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.265459 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265438 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:52:59.265459 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265450 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t78px\"" Apr 22 17:52:59.265654 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265562 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:52:59.265654 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265631 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:52:59.265654 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265632 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:52:59.265860 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:52:59.265860 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265824 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnfvt\"" Apr 22 17:52:59.265860 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.265831 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 17:52:59.266028 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.266013 2574 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:52:59.266098 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.266020 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cxxtn\"" Apr 22 17:52:59.266173 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.266147 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.266580 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.266567 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:52:59.268073 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.268058 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:52:59.268138 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.268091 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v965b\"" Apr 22 17:52:59.269885 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.269866 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:52:59.269984 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.269934 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:52:59.270341 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270315 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5mp6q\"" Apr 22 17:52:59.270401 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270320 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:52:59.270508 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-env-overrides\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.270546 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270517 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovnkube-script-lib\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.270546 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gf2p\" (UniqueName: \"kubernetes.io/projected/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-kube-api-access-4gf2p\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.270630 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8bb80bf-d436-4bad-a3bf-b26dcc359766-host\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.270630 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-system-cni-dir\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.270630 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovnkube-config\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.270630 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270620 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270635 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-os-release\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-cni-multus\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270681 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-etc-kubernetes\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270701 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htvgd\" (UniqueName: \"kubernetes.io/projected/aead7c8c-4dc1-4092-8e69-4f857803c825-kube-api-access-htvgd\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-modprobe-d\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270729 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aead7c8c-4dc1-4092-8e69-4f857803c825-cni-binary-copy\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" 
Apr 22 17:52:59.270784 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-hostroot\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-systemd\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-run\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-var-lib-kubelet\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270880 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270904 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e7747e6a-0f9c-48fa-a8e9-4648a187366c-konnectivity-ca\") pod \"konnectivity-agent-mp6ht\" (UID: \"e7747e6a-0f9c-48fa-a8e9-4648a187366c\") " pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2jrs\" (UniqueName: \"kubernetes.io/projected/3848ad83-45cd-43a9-8803-52cd31ab6f05-kube-api-access-d2jrs\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.270988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-slash\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qv8jk\" (UID: 
\"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-registration-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-systemd\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.271153 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-cni-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271182 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-sys\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-cni-netd\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovn-node-metrics-cert\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15e19007-4d32-4de6-9d4e-fbfa1c190965-iptables-alerter-script\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-cnibin\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271287 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-etc-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.271564 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:52:59.271301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-sys-fs\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-cnibin\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8bb80bf-d436-4bad-a3bf-b26dcc359766-serviceca\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-socket-dir-parent\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-multus-certs\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15e19007-4d32-4de6-9d4e-fbfa1c190965-host-slash\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271459 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysctl-conf\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271484 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-systemd-units\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.271564 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-node-log\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcd6q\" (UniqueName: \"kubernetes.io/projected/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-kube-api-access-lcd6q\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-var-lib-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-kubelet\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-conf-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmhbp\" (UniqueName: \"kubernetes.io/projected/925c84c5-beea-448d-af92-aa9ab7a10629-kube-api-access-xmhbp\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-cni-bin\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271720 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-kubernetes\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/925c84c5-beea-448d-af92-aa9ab7a10629-tmp\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271766 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-os-release\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-cni-binary-copy\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-kubelet\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-socket-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysctl-d\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-host\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272268 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-device-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.271999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-cni-bin\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-daemon-config\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tglxx\" (UniqueName: \"kubernetes.io/projected/15e19007-4d32-4de6-9d4e-fbfa1c190965-kube-api-access-tglxx\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272090 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgkhp\" (UniqueName: \"kubernetes.io/projected/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-kube-api-access-zgkhp\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e7747e6a-0f9c-48fa-a8e9-4648a187366c-agent-certs\") pod \"konnectivity-agent-mp6ht\" (UID: \"e7747e6a-0f9c-48fa-a8e9-4648a187366c\") " pod="kube-system/konnectivity-agent-mp6ht"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-netns\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/925c84c5-beea-448d-af92-aa9ab7a10629-etc-tuned\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272214 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-log-socket\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-k8s-cni-cncf-io\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysconfig\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-lib-modules\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272367 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-run-netns\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.272891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.273422 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-ovn\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.273422 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-system-cni-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.273422 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.272510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2c7d\" (UniqueName: \"kubernetes.io/projected/f8bb80bf-d436-4bad-a3bf-b26dcc359766-kube-api-access-q2c7d\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv"
Apr 22 17:52:59.273422 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.273377 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd22163f3b6268afb9f8cf0bead0178.slice/crio-709289039a2822c820f182bc0c8ae2ace4d74f1268b8b3c4d74a804374011918 WatchSource:0}: Error finding container 709289039a2822c820f182bc0c8ae2ace4d74f1268b8b3c4d74a804374011918: Status 404 returned error can't find the container with id 709289039a2822c820f182bc0c8ae2ace4d74f1268b8b3c4d74a804374011918
Apr 22 17:52:59.273603 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.273586 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b16281a969b22277334b5b3f7efb159.slice/crio-4ebee1faad8d875dac32cc7638e71884cffb81c0d32295a7863e8b9d2ef0d361 WatchSource:0}: Error finding container 4ebee1faad8d875dac32cc7638e71884cffb81c0d32295a7863e8b9d2ef0d361: Status 404 returned error can't find the container with id 4ebee1faad8d875dac32cc7638e71884cffb81c0d32295a7863e8b9d2ef0d361
Apr 22 17:52:59.279634 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.279425 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:52:59.279634 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.279609 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:52:59.296836 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.296816 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tzz2q"
Apr 22 17:52:59.304241 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.304213 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:47:58 +0000 UTC" deadline="2028-01-29 20:08:05.20199161 +0000 UTC"
Apr 22 17:52:59.304294 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.304243 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15530h15m5.897752229s"
Apr 22 17:52:59.305836 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.305812 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tzz2q"
Apr 22 17:52:59.365555 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.365512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal" event={"ID":"5b16281a969b22277334b5b3f7efb159","Type":"ContainerStarted","Data":"4ebee1faad8d875dac32cc7638e71884cffb81c0d32295a7863e8b9d2ef0d361"}
Apr 22 17:52:59.365694 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.365682 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:52:59.366381 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.366360 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal" event={"ID":"cfd22163f3b6268afb9f8cf0bead0178","Type":"ContainerStarted","Data":"709289039a2822c820f182bc0c8ae2ace4d74f1268b8b3c4d74a804374011918"}
Apr 22 17:52:59.373243 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-cnibin\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.373322 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-etc-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.373322 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-sys-fs\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.373322 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-cnibin\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.373450 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-cnibin\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.373450 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-etc-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.373450 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373423 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-cnibin\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.373450 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8bb80bf-d436-4bad-a3bf-b26dcc359766-serviceca\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-socket-dir-parent\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-multus-certs\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15e19007-4d32-4de6-9d4e-fbfa1c190965-host-slash\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysctl-conf\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-systemd-units\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-sys-fs\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-node-log\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-multus-certs\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373577 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/15e19007-4d32-4de6-9d4e-fbfa1c190965-host-slash\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcd6q\" (UniqueName: \"kubernetes.io/projected/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-kube-api-access-lcd6q\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk"
Apr 22 17:52:59.373627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-systemd-units\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysctl-conf\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-socket-dir-parent\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373634 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-node-log\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8bb80bf-d436-4bad-a3bf-b26dcc359766-serviceca\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.373612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.374127 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374028 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.374387 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-var-lib-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-kubelet\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.374475 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-conf-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.374475 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmhbp\" (UniqueName: \"kubernetes.io/projected/925c84c5-beea-448d-af92-aa9ab7a10629-kube-api-access-xmhbp\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.374566 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-cni-bin\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374566 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-kubelet\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.374566 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-cni-bin\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374566 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-var-lib-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374566 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-conf-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb"
Apr 22 17:52:59.374566 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-kubernetes\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/925c84c5-beea-448d-af92-aa9ab7a10629-tmp\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-os-release\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-kubernetes\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh"
Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-cni-binary-copy\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq"
Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-kubelet\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-socket-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-os-release\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysctl-d\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-host\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-kubelet\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374701 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-host\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-device-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374774 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-cni-bin\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-daemon-config\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374804 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-socket-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.374818 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysctl-d\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tglxx\" (UniqueName: \"kubernetes.io/projected/15e19007-4d32-4de6-9d4e-fbfa1c190965-kube-api-access-tglxx\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374804 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-device-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgkhp\" (UniqueName: \"kubernetes.io/projected/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-kube-api-access-zgkhp\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:52:59.375525 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:52:59.374864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-cni-bin\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374866 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e7747e6a-0f9c-48fa-a8e9-4648a187366c-agent-certs\") pod \"konnectivity-agent-mp6ht\" (UID: \"e7747e6a-0f9c-48fa-a8e9-4648a187366c\") " pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.374963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-netns\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/925c84c5-beea-448d-af92-aa9ab7a10629-etc-tuned\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-netns\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-log-socket\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-cni-binary-copy\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375129 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-k8s-cni-cncf-io\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-log-socket\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysconfig\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-lib-modules\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.375525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-run-netns\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375463 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-run-k8s-cni-cncf-io\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375585 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-ovn\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-openvswitch\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-ovn\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375645 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-system-cni-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2c7d\" (UniqueName: \"kubernetes.io/projected/f8bb80bf-d436-4bad-a3bf-b26dcc359766-kube-api-access-q2c7d\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-env-overrides\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovnkube-script-lib\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gf2p\" (UniqueName: \"kubernetes.io/projected/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-kube-api-access-4gf2p\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375791 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-daemon-config\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8bb80bf-d436-4bad-a3bf-b26dcc359766-host\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-run-netns\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-lib-modules\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-system-cni-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.376283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.375792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/f8bb80bf-d436-4bad-a3bf-b26dcc359766-host\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-system-cni-dir\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovnkube-config\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3848ad83-45cd-43a9-8803-52cd31ab6f05-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376431 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-os-release\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-cni-multus\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-etc-kubernetes\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htvgd\" (UniqueName: \"kubernetes.io/projected/aead7c8c-4dc1-4092-8e69-4f857803c825-kube-api-access-htvgd\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-modprobe-d\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376563 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-sysconfig\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-host-var-lib-cni-multus\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376635 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-etc-kubernetes\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-etc-selinux\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-modprobe-d\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.377056 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.376791 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-env-overrides\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovnkube-config\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aead7c8c-4dc1-4092-8e69-4f857803c825-cni-binary-copy\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377609 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-hostroot\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-systemd\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-run\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.377742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-var-lib-kubelet\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aead7c8c-4dc1-4092-8e69-4f857803c825-cni-binary-copy\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e7747e6a-0f9c-48fa-a8e9-4648a187366c-konnectivity-ca\") pod \"konnectivity-agent-mp6ht\" (UID: \"e7747e6a-0f9c-48fa-a8e9-4648a187366c\") " pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2jrs\" (UniqueName: \"kubernetes.io/projected/3848ad83-45cd-43a9-8803-52cd31ab6f05-kube-api-access-d2jrs\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-run\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377926 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-slash\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.377992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-registration-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-systemd\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378082 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-cni-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.378082 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:52:59.378078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-sys\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-cni-netd\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovn-node-metrics-cert\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-var-lib-kubelet\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15e19007-4d32-4de6-9d4e-fbfa1c190965-iptables-alerter-script\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 
17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/925c84c5-beea-448d-af92-aa9ab7a10629-etc-tuned\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378343 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/925c84c5-beea-448d-af92-aa9ab7a10629-tmp\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-run-systemd\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-etc-systemd\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-slash\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378496 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/925c84c5-beea-448d-af92-aa9ab7a10629-sys\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.378750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378541 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-hostroot\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.378750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3848ad83-45cd-43a9-8803-52cd31ab6f05-system-cni-dir\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.378750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378649 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-cni-netd\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e7747e6a-0f9c-48fa-a8e9-4648a187366c-konnectivity-ca\") pod \"konnectivity-agent-mp6ht\" (UID: \"e7747e6a-0f9c-48fa-a8e9-4648a187366c\") " pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.378750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378705 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-multus-cni-dir\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.378981 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-registration-dir\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.378981 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aead7c8c-4dc1-4092-8e69-4f857803c825-os-release\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.378981 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.378841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.378981 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.378859 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:59.379155 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.379026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/15e19007-4d32-4de6-9d4e-fbfa1c190965-iptables-alerter-script\") pod 
\"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 17:52:59.379155 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.379045 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:52:59.879003382 +0000 UTC m=+2.001113560 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:59.380214 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.380193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovnkube-script-lib\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.380739 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.380718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e7747e6a-0f9c-48fa-a8e9-4648a187366c-agent-certs\") pod \"konnectivity-agent-mp6ht\" (UID: \"e7747e6a-0f9c-48fa-a8e9-4648a187366c\") " pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.380981 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.380960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-ovn-node-metrics-cert\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.383443 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.383428 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:52:59.383523 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.383447 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:52:59.383523 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.383459 2574 projected.go:194] Error preparing data for projected volume kube-api-access-p92hv for pod openshift-network-diagnostics/network-check-target-sbf9k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:59.383612 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.383538 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv podName:02186aeb-7b19-4faf-885c-e060403bd058 nodeName:}" failed. No retries permitted until 2026-04-22 17:52:59.883504296 +0000 UTC m=+2.005614465 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p92hv" (UniqueName: "kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv") pod "network-check-target-sbf9k" (UID: "02186aeb-7b19-4faf-885c-e060403bd058") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:59.385310 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.385288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tglxx\" (UniqueName: \"kubernetes.io/projected/15e19007-4d32-4de6-9d4e-fbfa1c190965-kube-api-access-tglxx\") pod \"iptables-alerter-rlmgg\" (UID: \"15e19007-4d32-4de6-9d4e-fbfa1c190965\") " pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 17:52:59.386100 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.386077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmhbp\" (UniqueName: \"kubernetes.io/projected/925c84c5-beea-448d-af92-aa9ab7a10629-kube-api-access-xmhbp\") pod \"tuned-gtxxh\" (UID: \"925c84c5-beea-448d-af92-aa9ab7a10629\") " pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.386190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.386141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2c7d\" (UniqueName: \"kubernetes.io/projected/f8bb80bf-d436-4bad-a3bf-b26dcc359766-kube-api-access-q2c7d\") pod \"node-ca-wl6lv\" (UID: \"f8bb80bf-d436-4bad-a3bf-b26dcc359766\") " pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.386435 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.386412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcd6q\" (UniqueName: \"kubernetes.io/projected/13e0eee8-83ad-4d31-b78a-f91ab1f96dac-kube-api-access-lcd6q\") pod \"aws-ebs-csi-driver-node-pvlfk\" (UID: \"13e0eee8-83ad-4d31-b78a-f91ab1f96dac\") 
" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.387362 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.387321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htvgd\" (UniqueName: \"kubernetes.io/projected/aead7c8c-4dc1-4092-8e69-4f857803c825-kube-api-access-htvgd\") pod \"multus-555gb\" (UID: \"aead7c8c-4dc1-4092-8e69-4f857803c825\") " pod="openshift-multus/multus-555gb" Apr 22 17:52:59.387606 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.387579 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gf2p\" (UniqueName: \"kubernetes.io/projected/b77e864f-03d8-422c-8cfd-a0a44bbce6e2-kube-api-access-4gf2p\") pod \"ovnkube-node-qv8jk\" (UID: \"b77e864f-03d8-422c-8cfd-a0a44bbce6e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.387810 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.387793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2jrs\" (UniqueName: \"kubernetes.io/projected/3848ad83-45cd-43a9-8803-52cd31ab6f05-kube-api-access-d2jrs\") pod \"multus-additional-cni-plugins-v2plq\" (UID: \"3848ad83-45cd-43a9-8803-52cd31ab6f05\") " pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.387960 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.387944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgkhp\" (UniqueName: \"kubernetes.io/projected/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-kube-api-access-zgkhp\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:52:59.584126 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.584054 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-555gb" Apr 22 17:52:59.589727 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.589704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v2plq" Apr 22 17:52:59.590856 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.590829 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaead7c8c_4dc1_4092_8e69_4f857803c825.slice/crio-c5ab93aee466b6a6618bcd34b6e2445a36c9768eebbbd540368e01444222aed4 WatchSource:0}: Error finding container c5ab93aee466b6a6618bcd34b6e2445a36c9768eebbbd540368e01444222aed4: Status 404 returned error can't find the container with id c5ab93aee466b6a6618bcd34b6e2445a36c9768eebbbd540368e01444222aed4 Apr 22 17:52:59.595760 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.595739 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3848ad83_45cd_43a9_8803_52cd31ab6f05.slice/crio-cb9299baebc306d06a4f21476e8d1a87d34b714f7da30f81d297527b53dd8f6f WatchSource:0}: Error finding container cb9299baebc306d06a4f21476e8d1a87d34b714f7da30f81d297527b53dd8f6f: Status 404 returned error can't find the container with id cb9299baebc306d06a4f21476e8d1a87d34b714f7da30f81d297527b53dd8f6f Apr 22 17:52:59.596348 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.596316 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rlmgg" Apr 22 17:52:59.600737 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.600722 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" Apr 22 17:52:59.605681 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.605663 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" Apr 22 17:52:59.607388 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.607366 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77e864f_03d8_422c_8cfd_a0a44bbce6e2.slice/crio-f3fb4083ee05f8c07e259469bccc130f8521190f60bc74d110c71c7eb7c1e0ab WatchSource:0}: Error finding container f3fb4083ee05f8c07e259469bccc130f8521190f60bc74d110c71c7eb7c1e0ab: Status 404 returned error can't find the container with id f3fb4083ee05f8c07e259469bccc130f8521190f60bc74d110c71c7eb7c1e0ab Apr 22 17:52:59.610884 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.610836 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" Apr 22 17:52:59.612772 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.612754 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e0eee8_83ad_4d31_b78a_f91ab1f96dac.slice/crio-9598c22845f15139f38018f501c201cc5354ec1aaa0914e5cbb71a1891a76b99 WatchSource:0}: Error finding container 9598c22845f15139f38018f501c201cc5354ec1aaa0914e5cbb71a1891a76b99: Status 404 returned error can't find the container with id 9598c22845f15139f38018f501c201cc5354ec1aaa0914e5cbb71a1891a76b99 Apr 22 17:52:59.615998 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.615974 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:52:59.616630 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.616609 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925c84c5_beea_448d_af92_aa9ab7a10629.slice/crio-7ab7846559e7d3e460679441fbf12cf857119934d74d3ad35948aff0433e2a9a WatchSource:0}: Error finding container 7ab7846559e7d3e460679441fbf12cf857119934d74d3ad35948aff0433e2a9a: Status 404 returned error can't find the container with id 7ab7846559e7d3e460679441fbf12cf857119934d74d3ad35948aff0433e2a9a Apr 22 17:52:59.620968 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.620115 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wl6lv" Apr 22 17:52:59.623126 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.623104 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7747e6a_0f9c_48fa_a8e9_4648a187366c.slice/crio-df83ffcb9f49780420ff4135f101accdf745ace5f8d1bdfd88e19a2c4d396ccb WatchSource:0}: Error finding container df83ffcb9f49780420ff4135f101accdf745ace5f8d1bdfd88e19a2c4d396ccb: Status 404 returned error can't find the container with id df83ffcb9f49780420ff4135f101accdf745ace5f8d1bdfd88e19a2c4d396ccb Apr 22 17:52:59.627321 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:52:59.627298 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8bb80bf_d436_4bad_a3bf_b26dcc359766.slice/crio-a80daa662710489d6d300244a6d65cc070662a742e317ac0989e5de2f49da569 WatchSource:0}: Error finding container a80daa662710489d6d300244a6d65cc070662a742e317ac0989e5de2f49da569: Status 404 returned error can't find the container with id a80daa662710489d6d300244a6d65cc070662a742e317ac0989e5de2f49da569 Apr 22 17:52:59.699153 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:52:59.699124 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:52:59.815523 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.815482 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:52:59.881679 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.881597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:52:59.881830 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.881743 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:59.881830 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.881800 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:00.881782597 +0000 UTC m=+3.003892762 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:59.982206 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:52:59.982172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k" Apr 22 17:52:59.982403 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.982385 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:52:59.982480 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.982413 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:52:59.982480 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.982427 2574 projected.go:194] Error preparing data for projected volume kube-api-access-p92hv for pod openshift-network-diagnostics/network-check-target-sbf9k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:59.982581 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:52:59.982506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv podName:02186aeb-7b19-4faf-885c-e060403bd058 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:00.982462505 +0000 UTC m=+3.104572673 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p92hv" (UniqueName: "kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv") pod "network-check-target-sbf9k" (UID: "02186aeb-7b19-4faf-885c-e060403bd058") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:00.180403 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.164391 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:00.307424 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.307323 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:47:59 +0000 UTC" deadline="2028-01-26 22:22:10.811264942 +0000 UTC" Apr 22 17:53:00.307424 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.307371 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15460h29m10.503898425s" Apr 22 17:53:00.386382 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.386294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" event={"ID":"925c84c5-beea-448d-af92-aa9ab7a10629","Type":"ContainerStarted","Data":"7ab7846559e7d3e460679441fbf12cf857119934d74d3ad35948aff0433e2a9a"} Apr 22 17:53:00.407441 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.407376 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" event={"ID":"13e0eee8-83ad-4d31-b78a-f91ab1f96dac","Type":"ContainerStarted","Data":"9598c22845f15139f38018f501c201cc5354ec1aaa0914e5cbb71a1891a76b99"} Apr 22 17:53:00.417092 ip-10-0-135-143 kubenswrapper[2574]: I0422 
17:53:00.417041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rlmgg" event={"ID":"15e19007-4d32-4de6-9d4e-fbfa1c190965","Type":"ContainerStarted","Data":"5287e5a9a0031965dbec2e27715ce750402c8882ea0a52cc3b58a2afaeaec390"} Apr 22 17:53:00.435619 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.435514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555gb" event={"ID":"aead7c8c-4dc1-4092-8e69-4f857803c825","Type":"ContainerStarted","Data":"c5ab93aee466b6a6618bcd34b6e2445a36c9768eebbbd540368e01444222aed4"} Apr 22 17:53:00.438702 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.438655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wl6lv" event={"ID":"f8bb80bf-d436-4bad-a3bf-b26dcc359766","Type":"ContainerStarted","Data":"a80daa662710489d6d300244a6d65cc070662a742e317ac0989e5de2f49da569"} Apr 22 17:53:00.446614 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.446567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mp6ht" event={"ID":"e7747e6a-0f9c-48fa-a8e9-4648a187366c","Type":"ContainerStarted","Data":"df83ffcb9f49780420ff4135f101accdf745ace5f8d1bdfd88e19a2c4d396ccb"} Apr 22 17:53:00.451438 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.451318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"f3fb4083ee05f8c07e259469bccc130f8521190f60bc74d110c71c7eb7c1e0ab"} Apr 22 17:53:00.471051 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.471018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerStarted","Data":"cb9299baebc306d06a4f21476e8d1a87d34b714f7da30f81d297527b53dd8f6f"} Apr 22 17:53:00.889805 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:53:00.889492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:53:00.889805 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:00.889658 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:00.890034 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:00.889871 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:02.889846134 +0000 UTC m=+5.011956287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:00.990368 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:00.990315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k" Apr 22 17:53:00.990569 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:00.990489 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:00.990569 
ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:00.990509 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:00.990569 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:00.990521 2574 projected.go:194] Error preparing data for projected volume kube-api-access-p92hv for pod openshift-network-diagnostics/network-check-target-sbf9k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:00.990724 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:00.990578 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv podName:02186aeb-7b19-4faf-885c-e060403bd058 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:02.990559597 +0000 UTC m=+5.112669764 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p92hv" (UniqueName: "kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv") pod "network-check-target-sbf9k" (UID: "02186aeb-7b19-4faf-885c-e060403bd058") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:01.308289 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:01.308197 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:47:59 +0000 UTC" deadline="2027-12-25 20:01:37.253336595 +0000 UTC"
Apr 22 17:53:01.308289 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:01.308241 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14690h8m35.945099288s"
Apr 22 17:53:01.363373 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:01.363340 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:01.363554 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:01.363470 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:01.363735 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:01.363707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:01.363853 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:01.363831 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:02.269345 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:02.264424 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:02.908871 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:02.908839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:02.909349 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:02.909011 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:02.909349 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:02.909073 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:06.909052879 +0000 UTC m=+9.031163048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:03.009949 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:03.009883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:03.010135 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:03.010032 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:03.010135 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:03.010059 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:03.010135 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:03.010072 2574 projected.go:194] Error preparing data for projected volume kube-api-access-p92hv for pod openshift-network-diagnostics/network-check-target-sbf9k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:03.010285 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:03.010139 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv podName:02186aeb-7b19-4faf-885c-e060403bd058 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:07.010120486 +0000 UTC m=+9.132230648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-p92hv" (UniqueName: "kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv") pod "network-check-target-sbf9k" (UID: "02186aeb-7b19-4faf-885c-e060403bd058") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:03.363586 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:03.363552 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:03.363741 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:03.363679 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:03.363828 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:03.363750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:03.363883 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:03.363844 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:05.363650 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:05.363609 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:05.363650 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:05.363627 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:05.364135 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:05.363772 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:05.364260 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:05.364230 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:06.059888 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.059854 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wvndp"]
Apr 22 17:53:06.064157 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.063972 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.064157 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.064060 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:06.137576 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.137412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e29345c1-1d57-44eb-aacb-0f51d483baf0-dbus\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.137576 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.137487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.137576 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.137526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e29345c1-1d57-44eb-aacb-0f51d483baf0-kubelet-config\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.237877 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.237839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.238063 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.237888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e29345c1-1d57-44eb-aacb-0f51d483baf0-kubelet-config\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.238063 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.237935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e29345c1-1d57-44eb-aacb-0f51d483baf0-dbus\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.238063 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.238020 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:06.238213 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.238090 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret podName:e29345c1-1d57-44eb-aacb-0f51d483baf0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:06.73807077 +0000 UTC m=+8.860180928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret") pod "global-pull-secret-syncer-wvndp" (UID: "e29345c1-1d57-44eb-aacb-0f51d483baf0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:06.238213 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.238103 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e29345c1-1d57-44eb-aacb-0f51d483baf0-dbus\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.238213 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.238114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e29345c1-1d57-44eb-aacb-0f51d483baf0-kubelet-config\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.741963 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.741922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:06.742443 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.742058 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:06.742443 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.742121 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret podName:e29345c1-1d57-44eb-aacb-0f51d483baf0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:07.742102746 +0000 UTC m=+9.864212906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret") pod "global-pull-secret-syncer-wvndp" (UID: "e29345c1-1d57-44eb-aacb-0f51d483baf0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:06.942858 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:06.942822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:06.943043 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.942986 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:06.943110 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:06.943061 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:14.943040996 +0000 UTC m=+17.065151154 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:07.043457 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:07.043368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:07.043617 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.043579 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:07.043617 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.043604 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:07.043617 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.043618 2574 projected.go:194] Error preparing data for projected volume kube-api-access-p92hv for pod openshift-network-diagnostics/network-check-target-sbf9k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:07.043781 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.043685 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv podName:02186aeb-7b19-4faf-885c-e060403bd058 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:15.043666874 +0000 UTC m=+17.165777043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p92hv" (UniqueName: "kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv") pod "network-check-target-sbf9k" (UID: "02186aeb-7b19-4faf-885c-e060403bd058") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:07.363649 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:07.363574 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:07.363810 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:07.363579 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:07.363810 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.363696 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:07.363909 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.363809 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:07.749158 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:07.749046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:07.749586 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.749212 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:07.749586 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:07.749290 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret podName:e29345c1-1d57-44eb-aacb-0f51d483baf0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:09.749266012 +0000 UTC m=+11.871376176 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret") pod "global-pull-secret-syncer-wvndp" (UID: "e29345c1-1d57-44eb-aacb-0f51d483baf0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:08.364140 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:08.363963 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:08.364140 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:08.364080 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:09.363742 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:09.363667 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:09.364123 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:09.363678 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:09.364123 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:09.363785 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:09.364123 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:09.363862 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:09.764130 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:09.764048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:09.764272 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:09.764165 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:09.764272 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:09.764224 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret podName:e29345c1-1d57-44eb-aacb-0f51d483baf0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:13.764203879 +0000 UTC m=+15.886314037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret") pod "global-pull-secret-syncer-wvndp" (UID: "e29345c1-1d57-44eb-aacb-0f51d483baf0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:10.363202 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:10.363171 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:10.363405 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:10.363342 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:11.363970 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:11.363933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:11.364448 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:11.364050 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:11.364448 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:11.364113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:11.364448 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:11.364237 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:12.363715 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:12.363686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:12.363872 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:12.363798 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:13.363465 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:13.363443 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:13.363714 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:13.363448 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:13.363714 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:13.363528 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:13.363714 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:13.363622 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:13.791633 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:13.791560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:13.791771 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:13.791704 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:13.791825 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:13.791771 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret podName:e29345c1-1d57-44eb-aacb-0f51d483baf0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:21.791752356 +0000 UTC m=+23.913862530 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret") pod "global-pull-secret-syncer-wvndp" (UID: "e29345c1-1d57-44eb-aacb-0f51d483baf0") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:14.363938 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:14.363897 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:14.364311 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:14.364038 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:15.001036 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:15.001009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:15.001234 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.001153 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:15.001234 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.001213 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:53:31.001195818 +0000 UTC m=+33.123305985 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:15.102344 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:15.102298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:15.102508 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.102486 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:15.102586 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.102510 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:15.102586 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.102523 2574 projected.go:194] Error preparing data for projected volume kube-api-access-p92hv for pod openshift-network-diagnostics/network-check-target-sbf9k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:15.102686 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.102585 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv podName:02186aeb-7b19-4faf-885c-e060403bd058 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:31.102565599 +0000 UTC m=+33.224675765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-p92hv" (UniqueName: "kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv") pod "network-check-target-sbf9k" (UID: "02186aeb-7b19-4faf-885c-e060403bd058") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:15.363260 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:15.363181 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:15.363516 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:15.363315 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:15.363720 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.363653 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:15.364115 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:15.364066 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:16.363354 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:16.363308 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:16.363548 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:16.363451 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:17.363912 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:17.363890 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:17.364232 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:17.363895 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:17.364232 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:17.363999 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c" Apr 22 17:53:17.364232 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:17.364096 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058" Apr 22 17:53:17.499663 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:17.499624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal" event={"ID":"5b16281a969b22277334b5b3f7efb159","Type":"ContainerStarted","Data":"a83f06b088977a9e0607cd781069b969c767c0131db84015601c6c6580a2c22d"} Apr 22 17:53:17.502943 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:17.502913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"7ab0059ebcb502d027b5c072100db72abe9fe076ee37dbca80b940d81df7f098"} Apr 22 17:53:17.515270 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:17.515220 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-143.ec2.internal" podStartSLOduration=18.515182413 podStartE2EDuration="18.515182413s" podCreationTimestamp="2026-04-22 17:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:17.514526196 +0000 UTC m=+19.636636370" watchObservedRunningTime="2026-04-22 17:53:17.515182413 +0000 UTC m=+19.637292585" Apr 22 17:53:18.363954 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.363627 2574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp" Apr 22 17:53:18.364497 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:18.363999 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0" Apr 22 17:53:18.505963 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.505931 2574 generic.go:358] "Generic (PLEG): container finished" podID="cfd22163f3b6268afb9f8cf0bead0178" containerID="771963e0a313ea87fe3ae3e0af398885b94adb81fdc31bb56e6b7f2e33fcf73f" exitCode=0 Apr 22 17:53:18.506195 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.506007 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal" event={"ID":"cfd22163f3b6268afb9f8cf0bead0178","Type":"ContainerDied","Data":"771963e0a313ea87fe3ae3e0af398885b94adb81fdc31bb56e6b7f2e33fcf73f"} Apr 22 17:53:18.507563 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.507533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wl6lv" event={"ID":"f8bb80bf-d436-4bad-a3bf-b26dcc359766","Type":"ContainerStarted","Data":"b911de3e2a11b917cbdc94f86471a56ebded3ab936cdd463993690f2e35c3041"} Apr 22 17:53:18.509046 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.509017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mp6ht" event={"ID":"e7747e6a-0f9c-48fa-a8e9-4648a187366c","Type":"ContainerStarted","Data":"75054540d3a7de8a8f655a8e7cc48a434c0ccb50a4e4b5763a9b518d033f4bcc"} Apr 22 17:53:18.511907 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.511861 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 17:53:18.512432 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.512407 2574 generic.go:358] "Generic (PLEG): container finished" podID="b77e864f-03d8-422c-8cfd-a0a44bbce6e2" containerID="de986b50cc64445a6df7ca29afcbf5f06311419b412d55c5afc60db9a79ce7e9" exitCode=1 Apr 22 17:53:18.512535 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.512492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"8e9e9f86159736ed475479a7d3f94eccc4b647bbdcd1946e0c14124949e98e76"} Apr 22 17:53:18.512535 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.512521 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"52cff19c224f0d554e20e0c77808bbba831ca7a2bd951593bf6ddec207618fd6"} Apr 22 17:53:18.512535 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.512531 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"a8092a7e1367f49b1b3dc677ec53a69fcb402c6c1600095c7ab42f0ed5cd602b"} Apr 22 17:53:18.512697 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.512540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"4a829253c763afba4cb25d6342a8f965269b131669b0dfad2175dc5ab547540f"} Apr 22 17:53:18.512697 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.512548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" 
event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerDied","Data":"de986b50cc64445a6df7ca29afcbf5f06311419b412d55c5afc60db9a79ce7e9"} Apr 22 17:53:18.513837 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.513817 2574 generic.go:358] "Generic (PLEG): container finished" podID="3848ad83-45cd-43a9-8803-52cd31ab6f05" containerID="1ca7c1ff10a0629fab13b145fdf821da09e414a5153ee70b8e444721743e3d68" exitCode=0 Apr 22 17:53:18.513946 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.513892 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerDied","Data":"1ca7c1ff10a0629fab13b145fdf821da09e414a5153ee70b8e444721743e3d68"} Apr 22 17:53:18.515371 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.515353 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" event={"ID":"925c84c5-beea-448d-af92-aa9ab7a10629","Type":"ContainerStarted","Data":"8e484145e8e60ac5581a92a382984f993545d77e01f11c29cf15b704a1f27433"} Apr 22 17:53:18.516887 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.516866 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" event={"ID":"13e0eee8-83ad-4d31-b78a-f91ab1f96dac","Type":"ContainerStarted","Data":"e39aea10c39c7fef4aff15d4bf3311fae197fd2c144e3c316668345b1f70a2e6"} Apr 22 17:53:18.518233 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.518210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-555gb" event={"ID":"aead7c8c-4dc1-4092-8e69-4f857803c825","Type":"ContainerStarted","Data":"76531f3b8859c86d4775ffedd2ff533cb6c1316edae8bba447dc2c62aec91e97"} Apr 22 17:53:18.548849 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.548801 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-gtxxh" podStartSLOduration=2.902829644 podStartE2EDuration="20.548786454s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.618820073 +0000 UTC m=+1.740930226" lastFinishedPulling="2026-04-22 17:53:17.264776877 +0000 UTC m=+19.386887036" observedRunningTime="2026-04-22 17:53:18.531864558 +0000 UTC m=+20.653974731" watchObservedRunningTime="2026-04-22 17:53:18.548786454 +0000 UTC m=+20.670896630" Apr 22 17:53:18.568987 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.568941 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mp6ht" podStartSLOduration=2.930107845 podStartE2EDuration="20.568926696s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.624468282 +0000 UTC m=+1.746578435" lastFinishedPulling="2026-04-22 17:53:17.263287116 +0000 UTC m=+19.385397286" observedRunningTime="2026-04-22 17:53:18.566452362 +0000 UTC m=+20.688562538" watchObservedRunningTime="2026-04-22 17:53:18.568926696 +0000 UTC m=+20.691036867" Apr 22 17:53:18.582678 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.582633 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wl6lv" podStartSLOduration=2.972879998 podStartE2EDuration="20.582617991s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.628698028 +0000 UTC m=+1.750808194" lastFinishedPulling="2026-04-22 17:53:17.23843602 +0000 UTC m=+19.360546187" observedRunningTime="2026-04-22 17:53:18.581917133 +0000 UTC m=+20.704027309" watchObservedRunningTime="2026-04-22 17:53:18.582617991 +0000 UTC m=+20.704728166" Apr 22 17:53:18.596883 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.596849 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-555gb" podStartSLOduration=2.6820969789999998 
podStartE2EDuration="20.596839913s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.592812255 +0000 UTC m=+1.714922409" lastFinishedPulling="2026-04-22 17:53:17.50755519 +0000 UTC m=+19.629665343" observedRunningTime="2026-04-22 17:53:18.596648968 +0000 UTC m=+20.718759142" watchObservedRunningTime="2026-04-22 17:53:18.596839913 +0000 UTC m=+20.718950089" Apr 22 17:53:18.916272 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:18.916248 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:53:19.333116 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.332952 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:53:18.916268833Z","UUID":"491b1ae0-e527-46c1-8767-29fa6ab578c2","Handler":null,"Name":"","Endpoint":""} Apr 22 17:53:19.335368 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.335346 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:53:19.335368 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.335374 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:53:19.363536 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.363508 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:53:19.363650 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.363508 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k" Apr 22 17:53:19.363709 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:19.363647 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c" Apr 22 17:53:19.363761 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:19.363708 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058" Apr 22 17:53:19.522423 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.522372 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal" event={"ID":"cfd22163f3b6268afb9f8cf0bead0178","Type":"ContainerStarted","Data":"90f535b6b9b3a8f83dcbb8ebdf68b114eec5283f10130e75fe476a53db5a61dc"} Apr 22 17:53:19.524712 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.524684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" event={"ID":"13e0eee8-83ad-4d31-b78a-f91ab1f96dac","Type":"ContainerStarted","Data":"3e2f255e693a205eed789d4b62cd0c06c161b2e6d268a2dd32c5a06e57a17fb0"} Apr 22 17:53:19.526307 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.526277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rlmgg" 
event={"ID":"15e19007-4d32-4de6-9d4e-fbfa1c190965","Type":"ContainerStarted","Data":"ca7f2292859dcff1b6f53fbbef96410e48aaba26fe71d3dc59606d15cf60b58b"} Apr 22 17:53:19.542553 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.542514 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-143.ec2.internal" podStartSLOduration=20.542503597 podStartE2EDuration="20.542503597s" podCreationTimestamp="2026-04-22 17:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:19.541834954 +0000 UTC m=+21.663945128" watchObservedRunningTime="2026-04-22 17:53:19.542503597 +0000 UTC m=+21.664613772" Apr 22 17:53:19.556947 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:19.556899 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rlmgg" podStartSLOduration=3.897863639 podStartE2EDuration="21.556884067s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.604284959 +0000 UTC m=+1.726395116" lastFinishedPulling="2026-04-22 17:53:17.263305389 +0000 UTC m=+19.385415544" observedRunningTime="2026-04-22 17:53:19.556750769 +0000 UTC m=+21.678860945" watchObservedRunningTime="2026-04-22 17:53:19.556884067 +0000 UTC m=+21.678994247" Apr 22 17:53:20.363600 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.363566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp" Apr 22 17:53:20.363844 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:20.363681 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0" Apr 22 17:53:20.463136 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.463100 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8jbmf"] Apr 22 17:53:20.466154 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.466132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.469041 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.469009 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:53:20.469145 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.469008 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mwpxm\"" Apr 22 17:53:20.469145 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.469062 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:53:20.529786 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.529740 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" event={"ID":"13e0eee8-83ad-4d31-b78a-f91ab1f96dac","Type":"ContainerStarted","Data":"96dde0d90c02341f92f139b89ed301a3f2193074b671eac06f6bcbc0ab49c22b"} Apr 22 17:53:20.532766 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.532749 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 17:53:20.533117 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.533066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" 
event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"710b308448f49070c38a1c9f6d50bfac148c157397d3746df9984b6086e47d5a"} Apr 22 17:53:20.543769 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.543750 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjkc\" (UniqueName: \"kubernetes.io/projected/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-kube-api-access-sjjkc\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.543883 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.543823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-tmp-dir\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.543883 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.543857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-hosts-file\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.644253 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.644175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-tmp-dir\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.644253 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.644234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-hosts-file\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.644488 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.644312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjkc\" (UniqueName: \"kubernetes.io/projected/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-kube-api-access-sjjkc\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.644488 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.644380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-hosts-file\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.644972 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.644947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-tmp-dir\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.655075 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.655054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjkc\" (UniqueName: \"kubernetes.io/projected/0e63511d-3bbf-4d8b-bff0-7c1cc73c694c-kube-api-access-sjjkc\") pod \"node-resolver-8jbmf\" (UID: \"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c\") " pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.776281 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:20.776251 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-8jbmf" Apr 22 17:53:20.785712 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:53:20.785581 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e63511d_3bbf_4d8b_bff0_7c1cc73c694c.slice/crio-1a77844056833f930daaca19bc8b4275a5a63a81b3973182a3a3da5f28d817d4 WatchSource:0}: Error finding container 1a77844056833f930daaca19bc8b4275a5a63a81b3973182a3a3da5f28d817d4: Status 404 returned error can't find the container with id 1a77844056833f930daaca19bc8b4275a5a63a81b3973182a3a3da5f28d817d4 Apr 22 17:53:21.363190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:21.363152 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:53:21.363402 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:21.363192 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k" Apr 22 17:53:21.363402 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:21.363286 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c" Apr 22 17:53:21.363495 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:21.363427 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058" Apr 22 17:53:21.536307 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:21.536276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8jbmf" event={"ID":"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c","Type":"ContainerStarted","Data":"1a77844056833f930daaca19bc8b4275a5a63a81b3973182a3a3da5f28d817d4"} Apr 22 17:53:21.851677 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:21.851604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp" Apr 22 17:53:21.851839 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:21.851717 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:21.851839 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:21.851775 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret podName:e29345c1-1d57-44eb-aacb-0f51d483baf0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:37.851758402 +0000 UTC m=+39.973868571 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret") pod "global-pull-secret-syncer-wvndp" (UID: "e29345c1-1d57-44eb-aacb-0f51d483baf0") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:22.357187 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.357158 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:53:22.357863 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.357843 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mp6ht" Apr 22 17:53:22.363045 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.363025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp" Apr 22 17:53:22.363165 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:22.363109 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:22.373952 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.373899 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pvlfk" podStartSLOduration=4.305546089 podStartE2EDuration="24.373883577s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.613972983 +0000 UTC m=+1.736083135" lastFinishedPulling="2026-04-22 17:53:19.682310465 +0000 UTC m=+21.804420623" observedRunningTime="2026-04-22 17:53:20.548915835 +0000 UTC m=+22.671026011" watchObservedRunningTime="2026-04-22 17:53:22.373883577 +0000 UTC m=+24.495993757"
Apr 22 17:53:22.542867 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.542809 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log"
Apr 22 17:53:22.543639 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.543240 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"8ea4bcaa5cb4f4518a4124736f47d801c533581b344b90375845f65d4871d75f"}
Apr 22 17:53:22.543639 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.543592 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:53:22.543804 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.543785 2574 scope.go:117] "RemoveContainer" containerID="de986b50cc64445a6df7ca29afcbf5f06311419b412d55c5afc60db9a79ce7e9"
Apr 22 17:53:22.545070 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.544661 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8jbmf" event={"ID":"0e63511d-3bbf-4d8b-bff0-7c1cc73c694c","Type":"ContainerStarted","Data":"27bdd283a72ab58fdd98084642caf64174178d7133545ecec21cabad34f39c67"}
Apr 22 17:53:22.545277 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.545059 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mp6ht"
Apr 22 17:53:22.545750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.545718 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mp6ht"
Apr 22 17:53:22.562942 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:22.562921 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:53:23.363963 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.363931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:23.364122 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:23.364054 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:23.364122 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.364113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:23.364234 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:23.364214 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:23.548956 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.548891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log"
Apr 22 17:53:23.549361 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.549228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" event={"ID":"b77e864f-03d8-422c-8cfd-a0a44bbce6e2","Type":"ContainerStarted","Data":"e67b81818b1ad937b88cb5adcd08945407199fd4928bc0c366c14d1774e911a0"}
Apr 22 17:53:23.549361 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.549354 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 17:53:23.549473 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.549458 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:53:23.550872 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.550847 2574 generic.go:358] "Generic (PLEG): container finished" podID="3848ad83-45cd-43a9-8803-52cd31ab6f05" containerID="813fd898ab9f09d2291cb27b52dbbab88909886812fc3d5b1a85825a649d5428" exitCode=0
Apr 22 17:53:23.550991 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.550923 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerDied","Data":"813fd898ab9f09d2291cb27b52dbbab88909886812fc3d5b1a85825a649d5428"}
Apr 22 17:53:23.564027 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.564008 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:53:23.581446 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.580375 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8jbmf" podStartSLOduration=3.580359936 podStartE2EDuration="3.580359936s" podCreationTimestamp="2026-04-22 17:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:22.65964261 +0000 UTC m=+24.781752787" watchObservedRunningTime="2026-04-22 17:53:23.580359936 +0000 UTC m=+25.702470112"
Apr 22 17:53:23.604839 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:23.604786 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk" podStartSLOduration=7.934990277 podStartE2EDuration="25.604767627s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.609526264 +0000 UTC m=+1.731636417" lastFinishedPulling="2026-04-22 17:53:17.27930361 +0000 UTC m=+19.401413767" observedRunningTime="2026-04-22 17:53:23.581728101 +0000 UTC m=+25.703838256" watchObservedRunningTime="2026-04-22 17:53:23.604767627 +0000 UTC m=+25.726877804"
Apr 22 17:53:24.358603 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.358571 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvndp"]
Apr 22 17:53:24.358713 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.358704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:24.358816 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:24.358797 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:24.359305 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.359285 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgdxx"]
Apr 22 17:53:24.359423 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.359398 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:24.359489 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:24.359472 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:24.360572 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.360546 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sbf9k"]
Apr 22 17:53:24.360649 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.360627 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:24.360713 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:24.360690 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:24.553989 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.553905 2574 generic.go:358] "Generic (PLEG): container finished" podID="3848ad83-45cd-43a9-8803-52cd31ab6f05" containerID="71da3be700010d99a7b11849d20add069b23e4cef0290030be00912003c9e994" exitCode=0
Apr 22 17:53:24.554452 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.553996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerDied","Data":"71da3be700010d99a7b11849d20add069b23e4cef0290030be00912003c9e994"}
Apr 22 17:53:24.554452 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:24.554216 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 17:53:25.557279 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:25.557198 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerDied","Data":"fa6cb27212eb971b9036fc0b12b50b0febbb83c39f448e6049ce541cac158d67"}
Apr 22 17:53:25.557698 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:25.557115 2574 generic.go:358] "Generic (PLEG): container finished" podID="3848ad83-45cd-43a9-8803-52cd31ab6f05" containerID="fa6cb27212eb971b9036fc0b12b50b0febbb83c39f448e6049ce541cac158d67" exitCode=0
Apr 22 17:53:25.557898 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:25.557879 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 17:53:26.363795 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:26.363705 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:26.363938 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:26.363791 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:26.363938 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:26.363889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:26.363938 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:26.363921 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:26.364058 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:26.364039 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:26.364193 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:26.364161 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:28.364265 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:28.364188 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:28.364685 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:28.364288 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:28.364685 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:28.364343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:28.364685 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:28.364367 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wvndp" podUID="e29345c1-1d57-44eb-aacb-0f51d483baf0"
Apr 22 17:53:28.364685 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:28.364430 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c"
Apr 22 17:53:28.364685 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:28.364488 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sbf9k" podUID="02186aeb-7b19-4faf-885c-e060403bd058"
Apr 22 17:53:29.161365 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:29.161132 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:53:29.161619 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:29.161602 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 17:53:29.182206 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:29.182160 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qv8jk"
Apr 22 17:53:30.224595 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.224560 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-143.ec2.internal" event="NodeReady"
Apr 22 17:53:30.225038 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.224712 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 17:53:30.266404 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.266375 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"]
Apr 22 17:53:30.293058 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.293031 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"]
Apr 22 17:53:30.293235 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.293214 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:30.296857 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.296652 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 17:53:30.296857 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.296663 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 17:53:30.296857 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.296757 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 17:53:30.296857 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.296853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 17:53:30.308084 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.307821 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86d85b6d7b-8hvcf"]
Apr 22 17:53:30.308525 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.308226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"
Apr 22 17:53:30.311546 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.311525 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 17:53:30.311649 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.311525 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-nlwz8\""
Apr 22 17:53:30.323528 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.323474 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"]
Apr 22 17:53:30.323608 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.323590 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.325974 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.325954 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 17:53:30.325974 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.325969 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gd422\""
Apr 22 17:53:30.326116 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.325971 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 17:53:30.326283 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.326270 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 17:53:30.331585 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.331563 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 17:53:30.343371 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.343349 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k7jtj"]
Apr 22 17:53:30.343517 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.343502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.346210 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.346163 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 17:53:30.346347 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.346268 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 17:53:30.346347 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.346292 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 17:53:30.346471 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.346393 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 17:53:30.361549 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.361525 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"]
Apr 22 17:53:30.361549 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.361553 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"]
Apr 22 17:53:30.361684 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.361568 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86d85b6d7b-8hvcf"]
Apr 22 17:53:30.361684 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.361582 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"]
Apr 22 17:53:30.361684 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.361597 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b965g"]
Apr 22 17:53:30.361815 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.361693 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:30.364061 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.364042 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 17:53:30.364257 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.364240 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5nxd\""
Apr 22 17:53:30.364374 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.364248 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 17:53:30.380619 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.380597 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b965g"]
Apr 22 17:53:30.380782 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.380764 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:30.380852 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.380800 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:53:30.380852 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.380764 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:30.381012 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.380995 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:30.384106 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384079 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 17:53:30.384215 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384138 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:53:30.384215 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 17:53:30.384415 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384387 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k7jtj"]
Apr 22 17:53:30.384508 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384423 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jtfq9\""
Apr 22 17:53:30.384508 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384449 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 17:53:30.384623 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:53:30.384623 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.384569 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-frn4n\""
Apr 22 17:53:30.385212 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.385193 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:53:30.385299 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.385239 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:53:30.385299 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.385250 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qsx25\""
Apr 22 17:53:30.418605 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/61539d96-f006-4d1d-8dde-31a2e649c96c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr\" (UID: \"61539d96-f006-4d1d-8dde-31a2e649c96c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"
Apr 22 17:53:30.418719 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-image-registry-private-configuration\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.418719 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-registry-certificates\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.418871 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-installation-pull-secrets\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.418871 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.418966 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-trusted-ca\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.418966 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fbc902ec-79f1-450f-918d-2c5192ae8a01-klusterlet-config\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:30.418966 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhzk\" (UniqueName: \"kubernetes.io/projected/61539d96-f006-4d1d-8dde-31a2e649c96c-kube-api-access-8lhzk\") pod \"managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr\" (UID: \"61539d96-f006-4d1d-8dde-31a2e649c96c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"
Apr 22 17:53:30.419072 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.418973 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.419072 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-bound-sa-token\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.419072 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb8nt\" (UniqueName: \"kubernetes.io/projected/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-kube-api-access-sb8nt\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.419072 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-ca\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.419243 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk82s\" (UniqueName: \"kubernetes.io/projected/fbc902ec-79f1-450f-918d-2c5192ae8a01-kube-api-access-xk82s\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:30.419243 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d83afce0-9b9d-44a7-b334-d993784008e8-ca-trust-extracted\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.419243 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbc902ec-79f1-450f-918d-2c5192ae8a01-tmp\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:30.419243 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.419399 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.419399 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qlzs\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-kube-api-access-8qlzs\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.419399 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.419283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-hub\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.519770 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-installation-pull-secrets\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.519925 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.519925 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f672956a-1ca0-441f-a859-2d7cf55a77c5-config-volume\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:30.519925 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrt5c\" (UniqueName: \"kubernetes.io/projected/7747539e-8e37-4640-80ce-2863534d185a-kube-api-access-vrt5c\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:30.519925 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519866 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-trusted-ca\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:30.519925 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fbc902ec-79f1-450f-918d-2c5192ae8a01-klusterlet-config\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhzk\" (UniqueName: \"kubernetes.io/projected/61539d96-f006-4d1d-8dde-31a2e649c96c-kube-api-access-8lhzk\") pod \"managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr\" (UID: \"61539d96-f006-4d1d-8dde-31a2e649c96c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"
Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519952 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.519977 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f672956a-1ca0-441f-a859-2d7cf55a77c5-tmp-dir\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:30.520190 ip-10-0-135-143
kubenswrapper[2574]: I0422 17:53:30.520007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-bound-sa-token\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb8nt\" (UniqueName: \"kubernetes.io/projected/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-kube-api-access-sb8nt\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-ca\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk82s\" (UniqueName: \"kubernetes.io/projected/fbc902ec-79f1-450f-918d-2c5192ae8a01-kube-api-access-xk82s\") pod 
\"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.520190 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d83afce0-9b9d-44a7-b334-d993784008e8-ca-trust-extracted\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbc902ec-79f1-450f-918d-2c5192ae8a01-tmp\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520299 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qlzs\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-kube-api-access-8qlzs\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-hub\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/61539d96-f006-4d1d-8dde-31a2e649c96c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr\" (UID: \"61539d96-f006-4d1d-8dde-31a2e649c96c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-image-registry-private-configuration\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-registry-certificates\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2zs\" (UniqueName: \"kubernetes.io/projected/f672956a-1ca0-441f-a859-2d7cf55a77c5-kube-api-access-6j2zs\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.520844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d83afce0-9b9d-44a7-b334-d993784008e8-ca-trust-extracted\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.521161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fbc902ec-79f1-450f-918d-2c5192ae8a01-tmp\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 
17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.521229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-trusted-ca\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.521261 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.521277 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.521365 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:31.021344755 +0000 UTC m=+33.143454922 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found Apr 22 17:53:30.522590 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.521822 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.525187 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.525156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-hub\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.525285 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.525202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/fbc902ec-79f1-450f-918d-2c5192ae8a01-klusterlet-config\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:53:30.525285 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.525232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.525964 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.525806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-installation-pull-secrets\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.525964 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.525850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/61539d96-f006-4d1d-8dde-31a2e649c96c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr\" (UID: \"61539d96-f006-4d1d-8dde-31a2e649c96c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" Apr 22 17:53:30.526077 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.525976 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.526833 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.526806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-ca\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.527151 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.527113 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-image-registry-private-configuration\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.527151 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.527148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-registry-certificates\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.530970 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.530942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-bound-sa-token\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.532433 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.532392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb8nt\" (UniqueName: \"kubernetes.io/projected/2e2eb676-ea90-4f32-bd46-e3f71f3d49aa-kube-api-access-sb8nt\") pod \"cluster-proxy-proxy-agent-5b659ffd86-b9xll\" (UID: \"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:30.532972 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.532949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhzk\" (UniqueName: \"kubernetes.io/projected/61539d96-f006-4d1d-8dde-31a2e649c96c-kube-api-access-8lhzk\") pod 
\"managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr\" (UID: \"61539d96-f006-4d1d-8dde-31a2e649c96c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" Apr 22 17:53:30.533103 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.533082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk82s\" (UniqueName: \"kubernetes.io/projected/fbc902ec-79f1-450f-918d-2c5192ae8a01-kube-api-access-xk82s\") pod \"klusterlet-addon-workmgr-68b4bd444b-dbsxn\" (UID: \"fbc902ec-79f1-450f-918d-2c5192ae8a01\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:53:30.533373 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.533357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qlzs\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-kube-api-access-8qlzs\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:30.606545 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.606458 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:53:30.621427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2zs\" (UniqueName: \"kubernetes.io/projected/f672956a-1ca0-441f-a859-2d7cf55a77c5-kube-api-access-6j2zs\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.621565 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f672956a-1ca0-441f-a859-2d7cf55a77c5-config-volume\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.621565 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrt5c\" (UniqueName: \"kubernetes.io/projected/7747539e-8e37-4640-80ce-2863534d185a-kube-api-access-vrt5c\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:53:30.621682 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f672956a-1ca0-441f-a859-2d7cf55a77c5-tmp-dir\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.621682 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: 
\"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:53:30.621780 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.621764 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:53:30.621831 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.621831 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.621813 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:53:31.121799502 +0000 UTC m=+33.243909657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found Apr 22 17:53:30.621937 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.621907 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:53:30.621991 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:30.621943 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:31.121931919 +0000 UTC m=+33.244042072 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found Apr 22 17:53:30.621991 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.621945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f672956a-1ca0-441f-a859-2d7cf55a77c5-tmp-dir\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.622084 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.622040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f672956a-1ca0-441f-a859-2d7cf55a77c5-config-volume\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.624308 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.624283 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" Apr 22 17:53:30.630651 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.630633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2zs\" (UniqueName: \"kubernetes.io/projected/f672956a-1ca0-441f-a859-2d7cf55a77c5-kube-api-access-6j2zs\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:53:30.630884 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.630869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrt5c\" (UniqueName: \"kubernetes.io/projected/7747539e-8e37-4640-80ce-2863534d185a-kube-api-access-vrt5c\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:53:30.653277 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:30.653258 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.026466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.026594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.026740 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.026753 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.026812 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.026794111 +0000 UTC m=+34.148904267 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.026823 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:53:31.027377 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.026899 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:54:03.026881604 +0000 UTC m=+65.148991773 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : secret "metrics-daemon-secret" not found Apr 22 17:53:31.097777 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.097742 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"] Apr 22 17:53:31.100142 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.100123 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"] Apr 22 17:53:31.104471 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.104449 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr"] Apr 22 17:53:31.127627 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.127608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:31.127723 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.127637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:31.127723 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.127664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:31.127794 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.127774 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:31.127845 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.127835 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.127820949 +0000 UTC m=+34.249931102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found
Apr 22 17:53:31.127884 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.127856 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:31.127923 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:31.127914 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.127890986 +0000 UTC m=+34.250001142 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:31.130076 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.130052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p92hv\" (UniqueName: \"kubernetes.io/projected/02186aeb-7b19-4faf-885c-e060403bd058-kube-api-access-p92hv\") pod \"network-check-target-sbf9k\" (UID: \"02186aeb-7b19-4faf-885c-e060403bd058\") " pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:31.219571 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:53:31.219507 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e2eb676_ea90_4f32_bd46_e3f71f3d49aa.slice/crio-acd17572e75de80d7658cc45472387961d7d268294ac7203a0bc714f1a47aec7 WatchSource:0}: Error finding container acd17572e75de80d7658cc45472387961d7d268294ac7203a0bc714f1a47aec7: Status 404 returned error can't find the container with id acd17572e75de80d7658cc45472387961d7d268294ac7203a0bc714f1a47aec7
Apr 22 17:53:31.219758 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:53:31.219739 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbc902ec_79f1_450f_918d_2c5192ae8a01.slice/crio-7a7a2acd6135dba99866a08168b7852f18b210c18ed78b88f4220667460df530 WatchSource:0}: Error finding container 7a7a2acd6135dba99866a08168b7852f18b210c18ed78b88f4220667460df530: Status 404 returned error can't find the container with id 7a7a2acd6135dba99866a08168b7852f18b210c18ed78b88f4220667460df530
Apr 22 17:53:31.220251 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:53:31.220229 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61539d96_f006_4d1d_8dde_31a2e649c96c.slice/crio-f9cbad75389b73e67f1d4a922c07496740de023d9b46cc5f0885f44e6b9dec1e WatchSource:0}: Error finding container f9cbad75389b73e67f1d4a922c07496740de023d9b46cc5f0885f44e6b9dec1e: Status 404 returned error can't find the container with id f9cbad75389b73e67f1d4a922c07496740de023d9b46cc5f0885f44e6b9dec1e
Apr 22 17:53:31.307779 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.307752 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:31.431689 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.431522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sbf9k"]
Apr 22 17:53:31.435423 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:53:31.435396 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02186aeb_7b19_4faf_885c_e060403bd058.slice/crio-e3edf4b136ecd6843a25967be9a26582adfcab9a43f4076085be3ce6d1e6e5a9 WatchSource:0}: Error finding container e3edf4b136ecd6843a25967be9a26582adfcab9a43f4076085be3ce6d1e6e5a9: Status 404 returned error can't find the container with id e3edf4b136ecd6843a25967be9a26582adfcab9a43f4076085be3ce6d1e6e5a9
Apr 22 17:53:31.570030 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.569997 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" event={"ID":"61539d96-f006-4d1d-8dde-31a2e649c96c","Type":"ContainerStarted","Data":"f9cbad75389b73e67f1d4a922c07496740de023d9b46cc5f0885f44e6b9dec1e"}
Apr 22 17:53:31.570936 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.570914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" event={"ID":"fbc902ec-79f1-450f-918d-2c5192ae8a01","Type":"ContainerStarted","Data":"7a7a2acd6135dba99866a08168b7852f18b210c18ed78b88f4220667460df530"}
Apr 22 17:53:31.571848 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.571827 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" event={"ID":"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa","Type":"ContainerStarted","Data":"acd17572e75de80d7658cc45472387961d7d268294ac7203a0bc714f1a47aec7"}
Apr 22 17:53:31.574077 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.574056 2574 generic.go:358] "Generic (PLEG): container finished" podID="3848ad83-45cd-43a9-8803-52cd31ab6f05" containerID="9938f8af61e67f79f0466e4c81354d1d68f1cb8b3f0b4fd137aa051f591ac23a" exitCode=0
Apr 22 17:53:31.574168 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.574112 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerDied","Data":"9938f8af61e67f79f0466e4c81354d1d68f1cb8b3f0b4fd137aa051f591ac23a"}
Apr 22 17:53:31.575215 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:31.575197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sbf9k" event={"ID":"02186aeb-7b19-4faf-885c-e060403bd058","Type":"ContainerStarted","Data":"e3edf4b136ecd6843a25967be9a26582adfcab9a43f4076085be3ce6d1e6e5a9"}
Apr 22 17:53:32.035548 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:32.035468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:32.035705 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.035631 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:32.035705 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.035653 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found
Apr 22 17:53:32.035816 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.035718 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.035696584 +0000 UTC m=+36.157806752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found
Apr 22 17:53:32.136308 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:32.136271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:32.136492 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:32.136320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:32.136555 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.136539 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:32.136623 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.136604 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.136584758 +0000 UTC m=+36.258694914 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:32.137079 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.137018 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:32.137079 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:32.137074 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:53:34.137058423 +0000 UTC m=+36.259168579 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found
Apr 22 17:53:32.585275 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:32.585237 2574 generic.go:358] "Generic (PLEG): container finished" podID="3848ad83-45cd-43a9-8803-52cd31ab6f05" containerID="f80acf6527537dc08b6d3140ae1bbacf10a3bc24efc0c720e6a9f6927c6cec7f" exitCode=0
Apr 22 17:53:32.585827 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:32.585312 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerDied","Data":"f80acf6527537dc08b6d3140ae1bbacf10a3bc24efc0c720e6a9f6927c6cec7f"}
Apr 22 17:53:33.595684 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:33.595645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v2plq" event={"ID":"3848ad83-45cd-43a9-8803-52cd31ab6f05","Type":"ContainerStarted","Data":"75fb7b8e1b33ea2817d41178beac763a7ede915000c252f96b15af77bd12fafc"}
Apr 22 17:53:33.625111 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:33.624507 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v2plq" podStartSLOduration=3.964228445 podStartE2EDuration="35.624486156s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:52:59.59739446 +0000 UTC m=+1.719504613" lastFinishedPulling="2026-04-22 17:53:31.257652167 +0000 UTC m=+33.379762324" observedRunningTime="2026-04-22 17:53:33.623092411 +0000 UTC m=+35.745202585" watchObservedRunningTime="2026-04-22 17:53:33.624486156 +0000 UTC m=+35.746596342"
Apr 22 17:53:34.054459 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:34.054418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:34.054682 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.054646 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:34.054682 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.054662 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found
Apr 22 17:53:34.054792 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.054723 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:38.054704129 +0000 UTC m=+40.176814297 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found
Apr 22 17:53:34.155741 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:34.155703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:34.155938 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:34.155756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:34.155938 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.155884 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:34.156072 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.155955 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:53:38.1559361 +0000 UTC m=+40.278046267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found
Apr 22 17:53:34.156072 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.155884 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:34.156182 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:34.156127 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:38.15611207 +0000 UTC m=+40.278222227 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:37.887952 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:37.887733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:37.891778 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:37.891752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e29345c1-1d57-44eb-aacb-0f51d483baf0-original-pull-secret\") pod \"global-pull-secret-syncer-wvndp\" (UID: \"e29345c1-1d57-44eb-aacb-0f51d483baf0\") " pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:37.912901 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:37.912871 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wvndp"
Apr 22 17:53:38.089869 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:38.089838 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:38.090094 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.090007 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:38.090094 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.090028 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found
Apr 22 17:53:38.090094 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.090085 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:46.09006767 +0000 UTC m=+48.212177823 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found
Apr 22 17:53:38.190520 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:38.190456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:38.190520 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:38.190492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:38.190739 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.190593 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:38.190739 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.190646 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:38.190739 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.190664 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:46.190643939 +0000 UTC m=+48.312754112 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:38.190839 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:38.190758 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:53:46.190737689 +0000 UTC m=+48.312847846 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found
Apr 22 17:53:38.702561 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:38.702450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wvndp"]
Apr 22 17:53:38.705693 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:53:38.705666 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode29345c1_1d57_44eb_aacb_0f51d483baf0.slice/crio-fadd6594b7921a4413776d9a95f32200bfa60183d4a67b4943ee0bcd337bf3d1 WatchSource:0}: Error finding container fadd6594b7921a4413776d9a95f32200bfa60183d4a67b4943ee0bcd337bf3d1: Status 404 returned error can't find the container with id fadd6594b7921a4413776d9a95f32200bfa60183d4a67b4943ee0bcd337bf3d1
Apr 22 17:53:39.609865 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.609792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" event={"ID":"61539d96-f006-4d1d-8dde-31a2e649c96c","Type":"ContainerStarted","Data":"e58392bfbd77ca76f25a2b69bd8a1617985365854aa1cc86387b33db6093e8a9"}
Apr 22 17:53:39.612167 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.612131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" event={"ID":"fbc902ec-79f1-450f-918d-2c5192ae8a01","Type":"ContainerStarted","Data":"0af8083986d394bdc08416dd517b21b4d0063918398303f2f1f4ac2c18ffa332"}
Apr 22 17:53:39.612661 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.612605 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:39.614317 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.614293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvndp" event={"ID":"e29345c1-1d57-44eb-aacb-0f51d483baf0","Type":"ContainerStarted","Data":"fadd6594b7921a4413776d9a95f32200bfa60183d4a67b4943ee0bcd337bf3d1"}
Apr 22 17:53:39.615426 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.615396 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn"
Apr 22 17:53:39.616755 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.616730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" event={"ID":"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa","Type":"ContainerStarted","Data":"04315f7285baeb71a0003167645fc53091b13248ccff6d786c0ca265508593ac"}
Apr 22 17:53:39.618298 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.618196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sbf9k" event={"ID":"02186aeb-7b19-4faf-885c-e060403bd058","Type":"ContainerStarted","Data":"b2ad4a5b91792bec3e80aa629826100618cfaef6f4c96b47c44c586e3bf5195c"}
Apr 22 17:53:39.618421 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.618366 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:53:39.656453 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.656347 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" podStartSLOduration=13.306204305 podStartE2EDuration="20.656316573s" podCreationTimestamp="2026-04-22 17:53:19 +0000 UTC" firstStartedPulling="2026-04-22 17:53:31.235229588 +0000 UTC m=+33.357339747" lastFinishedPulling="2026-04-22 17:53:38.585341849 +0000 UTC m=+40.707452015" observedRunningTime="2026-04-22 17:53:39.633421462 +0000 UTC m=+41.755531638" watchObservedRunningTime="2026-04-22 17:53:39.656316573 +0000 UTC m=+41.778426749"
Apr 22 17:53:39.656597 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.656484 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" podStartSLOduration=13.292047285 podStartE2EDuration="20.656477117s" podCreationTimestamp="2026-04-22 17:53:19 +0000 UTC" firstStartedPulling="2026-04-22 17:53:31.235089686 +0000 UTC m=+33.357199845" lastFinishedPulling="2026-04-22 17:53:38.599519515 +0000 UTC m=+40.721629677" observedRunningTime="2026-04-22 17:53:39.654748833 +0000 UTC m=+41.776859048" watchObservedRunningTime="2026-04-22 17:53:39.656477117 +0000 UTC m=+41.778587293"
Apr 22 17:53:39.682836 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:39.682797 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sbf9k" podStartSLOduration=34.533910527 podStartE2EDuration="41.682781745s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:53:31.437225763 +0000 UTC m=+33.559335917" lastFinishedPulling="2026-04-22 17:53:38.58609698 +0000 UTC m=+40.708207135" observedRunningTime="2026-04-22 17:53:39.681717098 +0000 UTC m=+41.803827276" watchObservedRunningTime="2026-04-22 17:53:39.682781745 +0000 UTC m=+41.804891921"
Apr 22 17:53:43.628364 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:43.628313 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wvndp" event={"ID":"e29345c1-1d57-44eb-aacb-0f51d483baf0","Type":"ContainerStarted","Data":"2107a7c9a6ecbe86126b2398a750012b1f14b097fe4cc58b2ae1f44414fa6974"}
Apr 22 17:53:43.630116 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:43.630096 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" event={"ID":"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa","Type":"ContainerStarted","Data":"8903071b5e9d622e61a0aa15e3ec562c47e188483e2af325afd232ae6c4c0dd6"}
Apr 22 17:53:43.630192 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:43.630123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" event={"ID":"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa","Type":"ContainerStarted","Data":"dd372d11f4a9a182e5cc9b66abe92246f1c744aa701306c2eb3cceed2628b676"}
Apr 22 17:53:43.650966 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:43.650926 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wvndp" podStartSLOduration=33.311450251 podStartE2EDuration="37.650914717s" podCreationTimestamp="2026-04-22 17:53:06 +0000 UTC" firstStartedPulling="2026-04-22 17:53:38.707952989 +0000 UTC m=+40.830063159" lastFinishedPulling="2026-04-22 17:53:43.047417468 +0000 UTC m=+45.169527625" observedRunningTime="2026-04-22 17:53:43.64987038 +0000 UTC m=+45.771980557" watchObservedRunningTime="2026-04-22 17:53:43.650914717 +0000 UTC m=+45.773024885"
Apr 22 17:53:43.680201 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:43.680164 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" podStartSLOduration=13.272448514 podStartE2EDuration="24.680153955s" podCreationTimestamp="2026-04-22 17:53:19 +0000 UTC" firstStartedPulling="2026-04-22 17:53:31.235144464 +0000 UTC m=+33.357254619" lastFinishedPulling="2026-04-22 17:53:42.642849903 +0000 UTC m=+44.764960060" observedRunningTime="2026-04-22 17:53:43.679594896 +0000 UTC m=+45.801705073" watchObservedRunningTime="2026-04-22 17:53:43.680153955 +0000 UTC m=+45.802264129"
Apr 22 17:53:46.150750 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:46.150712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:53:46.151143 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.150830 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:46.151143 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.150841 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found
Apr 22 17:53:46.151143 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.150901 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.150886401 +0000 UTC m=+64.272996554 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found
Apr 22 17:53:46.250954 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:46.250924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:53:46.251102 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:53:46.250958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:53:46.251102 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.251086 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:46.251102 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.251100 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:46.251230 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.251149 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.251134308 +0000 UTC m=+64.373244462 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:46.251230 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:53:46.251163 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.25115709 +0000 UTC m=+64.373267243 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found
Apr 22 17:54:02.159693 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:02.159657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:54:02.160158 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.159813 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:02.160158 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.159827 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found
Apr 22 17:54:02.160158 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.159902 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:34.159878371 +0000 UTC m=+96.281988525 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found
Apr 22 17:54:02.260495 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:02.260462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:54:02.260495 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:02.260497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:54:02.260743 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.260620 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:54:02.260743 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.260625 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:54:02.260743 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.260668 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:34.260654607 +0000 UTC m=+96.382764761 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:02.260743 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:02.260705 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:54:34.260686895 +0000 UTC m=+96.382797048 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found
Apr 22 17:54:03.066369 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:03.066320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:54:03.066526 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:03.066439 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:54:03.066526 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:03.066511 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:55:07.066494634 +0000 UTC m=+129.188604808 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : secret "metrics-daemon-secret" not found
Apr 22 17:54:10.623028 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:10.622999 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sbf9k"
Apr 22 17:54:34.179530 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:34.179452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:54:34.179953 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.179556 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:54:34.179953 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.179567 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found
Apr 22 17:54:34.179953 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.179627 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:38.179612287 +0000 UTC m=+160.301722439 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found Apr 22 17:54:34.280597 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:34.280569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:54:34.280765 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:54:34.280740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:54:34.280833 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.280749 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:34.280833 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.280797 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:34.280833 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.280831 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:38.280815735 +0000 UTC m=+160.402925889 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found Apr 22 17:54:34.280962 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:54:34.280868 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:55:38.280849417 +0000 UTC m=+160.402959596 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found Apr 22 17:55:07.118478 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:07.118433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:55:07.119070 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:07.118606 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:55:07.119070 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:07.118720 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs podName:8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c nodeName:}" failed. No retries permitted until 2026-04-22 17:57:09.11869929 +0000 UTC m=+251.240809443 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs") pod "network-metrics-daemon-hgdxx" (UID: "8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c") : secret "metrics-daemon-secret" not found Apr 22 17:55:33.334208 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:33.334160 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" podUID="d83afce0-9b9d-44a7-b334-d993784008e8" Apr 22 17:55:33.379051 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:33.379017 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-k7jtj" podUID="f672956a-1ca0-441f-a859-2d7cf55a77c5" Apr 22 17:55:33.392179 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:33.392152 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b965g" podUID="7747539e-8e37-4640-80ce-2863534d185a" Apr 22 17:55:33.398472 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:33.398448 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hgdxx" podUID="8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c" Apr 22 17:55:33.881043 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:33.881009 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:55:33.881211 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:33.881009 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k7jtj" Apr 22 17:55:38.245681 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.245645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") pod \"image-registry-86d85b6d7b-8hvcf\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") " pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" Apr 22 17:55:38.246162 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.245809 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:55:38.246162 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.245832 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86d85b6d7b-8hvcf: secret "image-registry-tls" not found Apr 22 17:55:38.246162 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.245895 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls podName:d83afce0-9b9d-44a7-b334-d993784008e8 nodeName:}" failed. No retries permitted until 2026-04-22 17:57:40.245875108 +0000 UTC m=+282.367985265 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls") pod "image-registry-86d85b6d7b-8hvcf" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8") : secret "image-registry-tls" not found Apr 22 17:55:38.345999 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.345962 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:55:38.346184 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.346012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj" Apr 22 17:55:38.346184 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.346113 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:55:38.346184 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.346173 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls podName:f672956a-1ca0-441f-a859-2d7cf55a77c5 nodeName:}" failed. No retries permitted until 2026-04-22 17:57:40.34615944 +0000 UTC m=+282.468269593 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls") pod "dns-default-k7jtj" (UID: "f672956a-1ca0-441f-a859-2d7cf55a77c5") : secret "dns-default-metrics-tls" not found Apr 22 17:55:38.346522 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.346114 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:55:38.346522 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:55:38.346244 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert podName:7747539e-8e37-4640-80ce-2863534d185a nodeName:}" failed. No retries permitted until 2026-04-22 17:57:40.346230503 +0000 UTC m=+282.468340670 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert") pod "ingress-canary-b965g" (UID: "7747539e-8e37-4640-80ce-2863534d185a") : secret "canary-serving-cert" not found Apr 22 17:55:38.896535 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.896504 2574 generic.go:358] "Generic (PLEG): container finished" podID="61539d96-f006-4d1d-8dde-31a2e649c96c" containerID="e58392bfbd77ca76f25a2b69bd8a1617985365854aa1cc86387b33db6093e8a9" exitCode=255 Apr 22 17:55:38.896670 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.896577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" event={"ID":"61539d96-f006-4d1d-8dde-31a2e649c96c","Type":"ContainerDied","Data":"e58392bfbd77ca76f25a2b69bd8a1617985365854aa1cc86387b33db6093e8a9"} Apr 22 17:55:38.896909 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.896894 2574 scope.go:117] "RemoveContainer" containerID="e58392bfbd77ca76f25a2b69bd8a1617985365854aa1cc86387b33db6093e8a9" Apr 22 17:55:38.897817 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:55:38.897800 2574 generic.go:358] "Generic (PLEG): container finished" podID="fbc902ec-79f1-450f-918d-2c5192ae8a01" containerID="0af8083986d394bdc08416dd517b21b4d0063918398303f2f1f4ac2c18ffa332" exitCode=1 Apr 22 17:55:38.897891 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.897844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" event={"ID":"fbc902ec-79f1-450f-918d-2c5192ae8a01","Type":"ContainerDied","Data":"0af8083986d394bdc08416dd517b21b4d0063918398303f2f1f4ac2c18ffa332"} Apr 22 17:55:38.898097 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.898078 2574 scope.go:117] "RemoveContainer" containerID="0af8083986d394bdc08416dd517b21b4d0063918398303f2f1f4ac2c18ffa332" Apr 22 17:55:38.951576 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:38.951558 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8jbmf_0e63511d-3bbf-4d8b-bff0-7c1cc73c694c/dns-node-resolver/0.log" Apr 22 17:55:39.613072 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:39.613038 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:55:39.901535 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:39.901502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c5954bdd5-wpjlr" event={"ID":"61539d96-f006-4d1d-8dde-31a2e649c96c","Type":"ContainerStarted","Data":"0577d376a01c3f8c5021fb7fa8310c603ed88a08bfb1f7937a071dff5f901a16"} Apr 22 17:55:39.903003 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:39.902977 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" 
event={"ID":"fbc902ec-79f1-450f-918d-2c5192ae8a01","Type":"ContainerStarted","Data":"3232441b878029faa37d659b3839777425871af1a763945adab669dcbe4effe8"} Apr 22 17:55:39.903184 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:39.903169 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:55:39.903752 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:39.903736 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68b4bd444b-dbsxn" Apr 22 17:55:40.347361 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:40.347282 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wl6lv_f8bb80bf-d436-4bad-a3bf-b26dcc359766/node-ca/0.log" Apr 22 17:55:44.363846 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:44.363807 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b965g" Apr 22 17:55:44.364386 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:55:44.363807 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx" Apr 22 17:56:03.134540 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.134468 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tw8jw"] Apr 22 17:56:03.137502 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.137486 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.140178 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.140157 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:56:03.141490 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.141471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4dkvz\"" Apr 22 17:56:03.141594 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.141536 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:56:03.141594 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.141536 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:56:03.141594 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.141583 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:56:03.152036 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.152009 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tw8jw"] Apr 22 17:56:03.219462 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.219435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-data-volume\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.219462 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.219466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9m2nl\" (UniqueName: \"kubernetes.io/projected/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-kube-api-access-9m2nl\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.219610 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.219514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.219610 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.219566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.219610 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.219602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-crio-socket\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.320774 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.320748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tw8jw\" (UID: 
\"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.320876 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.320777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.320876 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.320804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-crio-socket\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.320952 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.320928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-data-volume\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.320997 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.320953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9m2nl\" (UniqueName: \"kubernetes.io/projected/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-kube-api-access-9m2nl\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.321035 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.321015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-crio-socket\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.321234 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.321212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-data-volume\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.321297 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.321283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.323006 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.322989 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.345988 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.345964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m2nl\" (UniqueName: \"kubernetes.io/projected/015ddcfd-46c1-460a-bfbe-a6cbe3eed75e-kube-api-access-9m2nl\") pod \"insights-runtime-extractor-tw8jw\" (UID: \"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e\") " pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.446161 ip-10-0-135-143 kubenswrapper[2574]: 
I0422 17:56:03.446083 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tw8jw" Apr 22 17:56:03.566381 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.566356 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tw8jw"] Apr 22 17:56:03.569364 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:56:03.569321 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015ddcfd_46c1_460a_bfbe_a6cbe3eed75e.slice/crio-8476117ce520c81663569d5120bec77e9a423bcf49cb6dcb06137aaea9d392c9 WatchSource:0}: Error finding container 8476117ce520c81663569d5120bec77e9a423bcf49cb6dcb06137aaea9d392c9: Status 404 returned error can't find the container with id 8476117ce520c81663569d5120bec77e9a423bcf49cb6dcb06137aaea9d392c9 Apr 22 17:56:03.963632 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.963596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tw8jw" event={"ID":"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e","Type":"ContainerStarted","Data":"a6c03c43210f4a0f972a208f3a8d8ac20d1d4e565380d5b0314df28ffe4fe46e"} Apr 22 17:56:03.963632 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:03.963637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tw8jw" event={"ID":"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e","Type":"ContainerStarted","Data":"8476117ce520c81663569d5120bec77e9a423bcf49cb6dcb06137aaea9d392c9"} Apr 22 17:56:04.968115 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:04.968076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tw8jw" event={"ID":"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e","Type":"ContainerStarted","Data":"cadf59a29a25c9b735dcefda449646f7c1acfc1a1e43d16a79b10cb8b4042cf4"} Apr 22 17:56:05.973660 ip-10-0-135-143 
kubenswrapper[2574]: I0422 17:56:05.972506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tw8jw" event={"ID":"015ddcfd-46c1-460a-bfbe-a6cbe3eed75e","Type":"ContainerStarted","Data":"0180a590d37823c05a2c4b5a13e614dee6c6d4ae3998357fe781e3766ff482bd"} Apr 22 17:56:05.992134 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:05.992089 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tw8jw" podStartSLOduration=0.86345334 podStartE2EDuration="2.992075278s" podCreationTimestamp="2026-04-22 17:56:03 +0000 UTC" firstStartedPulling="2026-04-22 17:56:03.624113112 +0000 UTC m=+185.746223265" lastFinishedPulling="2026-04-22 17:56:05.752735045 +0000 UTC m=+187.874845203" observedRunningTime="2026-04-22 17:56:05.990170348 +0000 UTC m=+188.112280527" watchObservedRunningTime="2026-04-22 17:56:05.992075278 +0000 UTC m=+188.114185452" Apr 22 17:56:15.224743 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.224709 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zp6sd"] Apr 22 17:56:15.227947 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.227928 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.230560 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.230542 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 17:56:15.230656 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.230578 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:56:15.231545 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.231529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 17:56:15.231837 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.231812 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:56:15.231930 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.231814 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lp8xv\""
Apr 22 17:56:15.231930 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.231914 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 17:56:15.232027 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.231962 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 17:56:15.318718 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-sys\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318845 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-accelerators-collector-config\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318845 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-textfile\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318845 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318960 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-tls\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318960 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcjx4\" (UniqueName: \"kubernetes.io/projected/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-kube-api-access-kcjx4\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318960 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318906 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-wtmp\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.318960 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-metrics-client-ca\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.319155 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.318980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-root\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420253 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420219 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-sys\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420386 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-accelerators-collector-config\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420386 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-sys\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420386 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-textfile\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420491 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420491 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-tls\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420554 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcjx4\" (UniqueName: \"kubernetes.io/projected/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-kube-api-access-kcjx4\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420554 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-wtmp\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420554 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-metrics-client-ca\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420659 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-root\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420740 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-wtmp\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420740 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:56:15.420722 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 17:56:15.420852 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:56:15.420785 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-tls podName:075f4a8c-dce4-4d3f-bcbe-b60f503daedc nodeName:}" failed. No retries permitted until 2026-04-22 17:56:15.920765275 +0000 UTC m=+198.042875430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-tls") pod "node-exporter-zp6sd" (UID: "075f4a8c-dce4-4d3f-bcbe-b60f503daedc") : secret "node-exporter-tls" not found
Apr 22 17:56:15.420852 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-textfile\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420852 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-accelerators-collector-config\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.420852 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-root\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.421053 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.420996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-metrics-client-ca\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.422598 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.422582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.429426 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.429409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcjx4\" (UniqueName: \"kubernetes.io/projected/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-kube-api-access-kcjx4\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.923710 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.923669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-tls\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:15.925972 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:15.925943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/075f4a8c-dce4-4d3f-bcbe-b60f503daedc-node-exporter-tls\") pod \"node-exporter-zp6sd\" (UID: \"075f4a8c-dce4-4d3f-bcbe-b60f503daedc\") " pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:16.136486 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:16.136455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zp6sd"
Apr 22 17:56:16.144007 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:56:16.143976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075f4a8c_dce4_4d3f_bcbe_b60f503daedc.slice/crio-2e25efa812fa985e9e4a0e4729d1bfe032e7df93da57f43959d8b1f69790b40d WatchSource:0}: Error finding container 2e25efa812fa985e9e4a0e4729d1bfe032e7df93da57f43959d8b1f69790b40d: Status 404 returned error can't find the container with id 2e25efa812fa985e9e4a0e4729d1bfe032e7df93da57f43959d8b1f69790b40d
Apr 22 17:56:16.998796 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:16.998761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zp6sd" event={"ID":"075f4a8c-dce4-4d3f-bcbe-b60f503daedc","Type":"ContainerStarted","Data":"0286bc5f51f97aaa8209dd3de70c0a4b000190f6068c654924cb480bf79d46a0"}
Apr 22 17:56:16.998796 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:16.998800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zp6sd" event={"ID":"075f4a8c-dce4-4d3f-bcbe-b60f503daedc","Type":"ContainerStarted","Data":"2e25efa812fa985e9e4a0e4729d1bfe032e7df93da57f43959d8b1f69790b40d"}
Apr 22 17:56:18.002050 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:18.002017 2574 generic.go:358] "Generic (PLEG): container finished" podID="075f4a8c-dce4-4d3f-bcbe-b60f503daedc" containerID="0286bc5f51f97aaa8209dd3de70c0a4b000190f6068c654924cb480bf79d46a0" exitCode=0
Apr 22 17:56:18.002439 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:18.002076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zp6sd" event={"ID":"075f4a8c-dce4-4d3f-bcbe-b60f503daedc","Type":"ContainerDied","Data":"0286bc5f51f97aaa8209dd3de70c0a4b000190f6068c654924cb480bf79d46a0"}
Apr 22 17:56:19.006284 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:19.006250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zp6sd" event={"ID":"075f4a8c-dce4-4d3f-bcbe-b60f503daedc","Type":"ContainerStarted","Data":"7a202ecd8b100d9ba5c0605de072950f9cb3b3473d2c272426e118c8dafc877f"}
Apr 22 17:56:19.006284 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:19.006286 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zp6sd" event={"ID":"075f4a8c-dce4-4d3f-bcbe-b60f503daedc","Type":"ContainerStarted","Data":"f372a3abcbd7764e8e4b0469886554b111558589ebdef9a241273f27e3f6913d"}
Apr 22 17:56:19.028004 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:19.027953 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zp6sd" podStartSLOduration=3.37013465 podStartE2EDuration="4.027940286s" podCreationTimestamp="2026-04-22 17:56:15 +0000 UTC" firstStartedPulling="2026-04-22 17:56:16.146128354 +0000 UTC m=+198.268238509" lastFinishedPulling="2026-04-22 17:56:16.803933775 +0000 UTC m=+198.926044145" observedRunningTime="2026-04-22 17:56:19.026754472 +0000 UTC m=+201.148864662" watchObservedRunningTime="2026-04-22 17:56:19.027940286 +0000 UTC m=+201.150050460"
Apr 22 17:56:25.432692 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:25.432657 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86d85b6d7b-8hvcf"]
Apr 22 17:56:25.433129 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:56:25.432846 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf" podUID="d83afce0-9b9d-44a7-b334-d993784008e8"
Apr 22 17:56:26.025675 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.025645 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:56:26.029491 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.029470 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:56:26.206010 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.205979 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-registry-certificates\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206010 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206016 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-installation-pull-secrets\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206252 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206048 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-trusted-ca\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206252 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206075 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d83afce0-9b9d-44a7-b334-d993784008e8-ca-trust-extracted\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206252 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206132 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qlzs\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-kube-api-access-8qlzs\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206252 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206182 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-image-registry-private-configuration\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206252 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206220 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-bound-sa-token\") pod \"d83afce0-9b9d-44a7-b334-d993784008e8\" (UID: \"d83afce0-9b9d-44a7-b334-d993784008e8\") "
Apr 22 17:56:26.206516 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206399 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:56:26.206516 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206405 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d83afce0-9b9d-44a7-b334-d993784008e8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:56:26.206516 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206502 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-registry-certificates\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:26.206645 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206522 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d83afce0-9b9d-44a7-b334-d993784008e8-ca-trust-extracted\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:26.206645 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.206567 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:56:26.208464 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.208432 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-kube-api-access-8qlzs" (OuterVolumeSpecName: "kube-api-access-8qlzs") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "kube-api-access-8qlzs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:56:26.208464 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.208454 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:56:26.208783 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.208766 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:56:26.208783 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.208767 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d83afce0-9b9d-44a7-b334-d993784008e8" (UID: "d83afce0-9b9d-44a7-b334-d993784008e8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:56:26.307427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.307346 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qlzs\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-kube-api-access-8qlzs\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:26.307427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.307374 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-image-registry-private-configuration\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:26.307427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.307386 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-bound-sa-token\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:26.307427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.307394 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d83afce0-9b9d-44a7-b334-d993784008e8-installation-pull-secrets\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:26.307427 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:26.307404 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d83afce0-9b9d-44a7-b334-d993784008e8-trusted-ca\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:27.027874 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:27.027843 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86d85b6d7b-8hvcf"
Apr 22 17:56:27.074228 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:27.074200 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86d85b6d7b-8hvcf"]
Apr 22 17:56:27.080100 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:27.080073 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-86d85b6d7b-8hvcf"]
Apr 22 17:56:27.215551 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:27.215522 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d83afce0-9b9d-44a7-b334-d993784008e8-registry-tls\") on node \"ip-10-0-135-143.ec2.internal\" DevicePath \"\""
Apr 22 17:56:28.369635 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:28.369599 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d83afce0-9b9d-44a7-b334-d993784008e8" path="/var/lib/kubelet/pods/d83afce0-9b9d-44a7-b334-d993784008e8/volumes"
Apr 22 17:56:40.654458 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:40.654394 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" podUID="2e2eb676-ea90-4f32-bd46-e3f71f3d49aa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 17:56:50.654664 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:56:50.654620 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" podUID="2e2eb676-ea90-4f32-bd46-e3f71f3d49aa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 17:57:00.654051 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:00.654009 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" podUID="2e2eb676-ea90-4f32-bd46-e3f71f3d49aa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 17:57:00.654592 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:00.654089 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll"
Apr 22 17:57:00.654592 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:00.654557 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"8903071b5e9d622e61a0aa15e3ec562c47e188483e2af325afd232ae6c4c0dd6"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 22 17:57:00.654693 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:00.654618 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" podUID="2e2eb676-ea90-4f32-bd46-e3f71f3d49aa" containerName="service-proxy" containerID="cri-o://8903071b5e9d622e61a0aa15e3ec562c47e188483e2af325afd232ae6c4c0dd6" gracePeriod=30
Apr 22 17:57:01.107473 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:01.107385 2574 generic.go:358] "Generic (PLEG): container finished" podID="2e2eb676-ea90-4f32-bd46-e3f71f3d49aa" containerID="8903071b5e9d622e61a0aa15e3ec562c47e188483e2af325afd232ae6c4c0dd6" exitCode=2
Apr 22 17:57:01.107473 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:01.107450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" event={"ID":"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa","Type":"ContainerDied","Data":"8903071b5e9d622e61a0aa15e3ec562c47e188483e2af325afd232ae6c4c0dd6"}
Apr 22 17:57:01.107636 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:01.107483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b659ffd86-b9xll" event={"ID":"2e2eb676-ea90-4f32-bd46-e3f71f3d49aa","Type":"ContainerStarted","Data":"42c8ad46f77d4b4b61fc2740542cc96d211c853cd18b03b9a4f79da48afacef1"}
Apr 22 17:57:09.123567 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:09.123531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:57:09.127122 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:09.127089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c-metrics-certs\") pod \"network-metrics-daemon-hgdxx\" (UID: \"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c\") " pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:57:09.267276 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:09.267249 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qsx25\""
Apr 22 17:57:09.275643 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:09.275604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgdxx"
Apr 22 17:57:09.403032 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:09.402921 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgdxx"]
Apr 22 17:57:09.405877 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:57:09.405847 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8531c9f8_7bc6_4bf2_a5f8_b7bd3a1c203c.slice/crio-f32849dd13a3d3d40a9294a128f36d48a33d18943510660d9bf8e4ced2e8838d WatchSource:0}: Error finding container f32849dd13a3d3d40a9294a128f36d48a33d18943510660d9bf8e4ced2e8838d: Status 404 returned error can't find the container with id f32849dd13a3d3d40a9294a128f36d48a33d18943510660d9bf8e4ced2e8838d
Apr 22 17:57:10.132481 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:10.132436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgdxx" event={"ID":"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c","Type":"ContainerStarted","Data":"f32849dd13a3d3d40a9294a128f36d48a33d18943510660d9bf8e4ced2e8838d"}
Apr 22 17:57:11.137251 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:11.137176 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgdxx" event={"ID":"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c","Type":"ContainerStarted","Data":"545595c05dfdfa23557a50f3131459c6b39534e5f7ddfc4b0450fe4b8926d37b"}
Apr 22 17:57:11.137251 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:11.137217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgdxx" event={"ID":"8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c","Type":"ContainerStarted","Data":"396b5c5c62235131206ef045910ca59321b0f657f657c551a8be8768080590e3"}
Apr 22 17:57:36.882377 ip-10-0-135-143 kubenswrapper[2574]: E0422 17:57:36.882309 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-k7jtj" podUID="f672956a-1ca0-441f-a859-2d7cf55a77c5"
Apr 22 17:57:37.203823 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:37.203748 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:57:40.369667 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.369638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:57:40.370186 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.369693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:57:40.371968 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.371943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7747539e-8e37-4640-80ce-2863534d185a-cert\") pod \"ingress-canary-b965g\" (UID: \"7747539e-8e37-4640-80ce-2863534d185a\") " pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:57:40.371968 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.371956 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f672956a-1ca0-441f-a859-2d7cf55a77c5-metrics-tls\") pod \"dns-default-k7jtj\" (UID: \"f672956a-1ca0-441f-a859-2d7cf55a77c5\") " pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:57:40.467792 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.467763 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-frn4n\""
Apr 22 17:57:40.475369 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.475351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b965g"
Apr 22 17:57:40.507001 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.506979 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5nxd\""
Apr 22 17:57:40.514406 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.514384 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k7jtj"
Apr 22 17:57:40.599961 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.599906 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hgdxx" podStartSLOduration=281.388231627 podStartE2EDuration="4m42.59986217s" podCreationTimestamp="2026-04-22 17:52:58 +0000 UTC" firstStartedPulling="2026-04-22 17:57:09.407916771 +0000 UTC m=+251.530026924" lastFinishedPulling="2026-04-22 17:57:10.619547307 +0000 UTC m=+252.741657467" observedRunningTime="2026-04-22 17:57:11.159138392 +0000 UTC m=+253.281248568" watchObservedRunningTime="2026-04-22 17:57:40.59986217 +0000 UTC m=+282.721972346"
Apr 22 17:57:40.601167 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.601146 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b965g"]
Apr 22 17:57:40.604462 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:57:40.604439 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7747539e_8e37_4640_80ce_2863534d185a.slice/crio-8b37dffc09e7307b53f53642282e0ad33e1aff81487323cfce19a385a4f66173 WatchSource:0}: Error finding container 8b37dffc09e7307b53f53642282e0ad33e1aff81487323cfce19a385a4f66173: Status 404 returned error can't find the container with id 8b37dffc09e7307b53f53642282e0ad33e1aff81487323cfce19a385a4f66173
Apr 22 17:57:40.636387 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:40.636365 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k7jtj"]
Apr 22 17:57:40.638088 ip-10-0-135-143 kubenswrapper[2574]: W0422 17:57:40.638063 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf672956a_1ca0_441f_a859_2d7cf55a77c5.slice/crio-f8014aa7805e269287493892d88861b2601b17c995ba221ef87cfa10439649d7 WatchSource:0}: Error finding container f8014aa7805e269287493892d88861b2601b17c995ba221ef87cfa10439649d7: Status 404 returned error can't find the container with id f8014aa7805e269287493892d88861b2601b17c995ba221ef87cfa10439649d7
Apr 22 17:57:41.214946 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:41.214879 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b965g" event={"ID":"7747539e-8e37-4640-80ce-2863534d185a","Type":"ContainerStarted","Data":"8b37dffc09e7307b53f53642282e0ad33e1aff81487323cfce19a385a4f66173"}
Apr 22 17:57:41.216050 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:41.216022 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k7jtj" event={"ID":"f672956a-1ca0-441f-a859-2d7cf55a77c5","Type":"ContainerStarted","Data":"f8014aa7805e269287493892d88861b2601b17c995ba221ef87cfa10439649d7"}
Apr 22 17:57:43.222984 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:43.222945 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b965g" event={"ID":"7747539e-8e37-4640-80ce-2863534d185a","Type":"ContainerStarted","Data":"33906b8865a31f5ef98b15b737b2692ca44a902a3203ab54954e0f3fcdf9a7e7"}
Apr 22 17:57:43.224523 ip-10-0-135-143 kubenswrapper[2574]: I0422
17:57:43.224502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k7jtj" event={"ID":"f672956a-1ca0-441f-a859-2d7cf55a77c5","Type":"ContainerStarted","Data":"45e24c48e9df97d490094527ca3ff151e55b8b3f5ff4d9110eac3d65ed2d8bc5"} Apr 22 17:57:43.224523 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:43.224527 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k7jtj" event={"ID":"f672956a-1ca0-441f-a859-2d7cf55a77c5","Type":"ContainerStarted","Data":"46f8e1afdceeb2f50c620be5dc9814077f543d2dd3388ac370a5cc86c8b5f475"} Apr 22 17:57:43.224649 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:43.224642 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-k7jtj" Apr 22 17:57:43.261300 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:43.261256 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b965g" podStartSLOduration=251.440099724 podStartE2EDuration="4m13.261244943s" podCreationTimestamp="2026-04-22 17:53:30 +0000 UTC" firstStartedPulling="2026-04-22 17:57:40.606789832 +0000 UTC m=+282.728900000" lastFinishedPulling="2026-04-22 17:57:42.427935066 +0000 UTC m=+284.550045219" observedRunningTime="2026-04-22 17:57:43.260092311 +0000 UTC m=+285.382202487" watchObservedRunningTime="2026-04-22 17:57:43.261244943 +0000 UTC m=+285.383355118" Apr 22 17:57:43.309428 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:43.309388 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k7jtj" podStartSLOduration=251.524481689 podStartE2EDuration="4m13.309375245s" podCreationTimestamp="2026-04-22 17:53:30 +0000 UTC" firstStartedPulling="2026-04-22 17:57:40.639764219 +0000 UTC m=+282.761874372" lastFinishedPulling="2026-04-22 17:57:42.424657773 +0000 UTC m=+284.546767928" observedRunningTime="2026-04-22 17:57:43.307787029 +0000 UTC m=+285.429897204" 
watchObservedRunningTime="2026-04-22 17:57:43.309375245 +0000 UTC m=+285.431485421" Apr 22 17:57:53.229640 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:53.229608 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k7jtj" Apr 22 17:57:58.280992 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:58.280958 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 17:57:58.281440 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:58.281092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 17:57:58.285769 ip-10-0-135-143 kubenswrapper[2574]: I0422 17:57:58.285747 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:02:14.502427 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.502346 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9"] Apr 22 18:02:14.505526 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.505506 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.510458 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.510434 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-4b2b7\"" Apr 22 18:02:14.511641 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.511517 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:02:14.511641 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.511584 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:02:14.511764 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.511525 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 18:02:14.519451 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.519430 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9"] Apr 22 18:02:14.648636 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.648607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8k64\" (UniqueName: \"kubernetes.io/projected/27292c88-6b62-4441-a982-7817b18ea524-kube-api-access-x8k64\") pod \"llmisvc-controller-manager-68cc5db7c4-4sjj9\" (UID: \"27292c88-6b62-4441-a982-7817b18ea524\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.648814 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.648710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27292c88-6b62-4441-a982-7817b18ea524-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-4sjj9\" (UID: \"27292c88-6b62-4441-a982-7817b18ea524\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" 
Apr 22 18:02:14.749389 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.749355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27292c88-6b62-4441-a982-7817b18ea524-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-4sjj9\" (UID: \"27292c88-6b62-4441-a982-7817b18ea524\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.749566 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.749412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8k64\" (UniqueName: \"kubernetes.io/projected/27292c88-6b62-4441-a982-7817b18ea524-kube-api-access-x8k64\") pod \"llmisvc-controller-manager-68cc5db7c4-4sjj9\" (UID: \"27292c88-6b62-4441-a982-7817b18ea524\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.751705 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.751680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27292c88-6b62-4441-a982-7817b18ea524-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-4sjj9\" (UID: \"27292c88-6b62-4441-a982-7817b18ea524\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.760343 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.760264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8k64\" (UniqueName: \"kubernetes.io/projected/27292c88-6b62-4441-a982-7817b18ea524-kube-api-access-x8k64\") pod \"llmisvc-controller-manager-68cc5db7c4-4sjj9\" (UID: \"27292c88-6b62-4441-a982-7817b18ea524\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.815080 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.815050 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:14.929365 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.929293 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9"] Apr 22 18:02:14.931879 ip-10-0-135-143 kubenswrapper[2574]: W0422 18:02:14.931852 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod27292c88_6b62_4441_a982_7817b18ea524.slice/crio-c85f2b65e1ac9b72b5c9adeac8168096a4aecf13817b94bd74622e782d77079f WatchSource:0}: Error finding container c85f2b65e1ac9b72b5c9adeac8168096a4aecf13817b94bd74622e782d77079f: Status 404 returned error can't find the container with id c85f2b65e1ac9b72b5c9adeac8168096a4aecf13817b94bd74622e782d77079f Apr 22 18:02:14.933058 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:14.933042 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:02:15.900920 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:15.900883 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" event={"ID":"27292c88-6b62-4441-a982-7817b18ea524","Type":"ContainerStarted","Data":"c85f2b65e1ac9b72b5c9adeac8168096a4aecf13817b94bd74622e782d77079f"} Apr 22 18:02:17.907060 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:17.907024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" event={"ID":"27292c88-6b62-4441-a982-7817b18ea524","Type":"ContainerStarted","Data":"4222e25951d901010f5f440745aadb8ee224c1e9ecd2c45adf9c827480f4a183"} Apr 22 18:02:17.907508 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:17.907146 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:17.942488 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:17.942446 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" podStartSLOduration=2.021400637 podStartE2EDuration="3.942432843s" podCreationTimestamp="2026-04-22 18:02:14 +0000 UTC" firstStartedPulling="2026-04-22 18:02:14.933170982 +0000 UTC m=+557.055281135" lastFinishedPulling="2026-04-22 18:02:16.854203186 +0000 UTC m=+558.976313341" observedRunningTime="2026-04-22 18:02:17.941817685 +0000 UTC m=+560.063927860" watchObservedRunningTime="2026-04-22 18:02:17.942432843 +0000 UTC m=+560.064543017" Apr 22 18:02:48.911809 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:48.911774 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-4sjj9" Apr 22 18:02:58.296792 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:58.296756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:02:58.297225 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:02:58.296894 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:03:23.588401 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.588309 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-cgfrr"] Apr 22 18:03:23.591363 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.591347 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:23.593824 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.593805 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:03:23.593911 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.593836 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-qvpnm\"" Apr 22 18:03:23.602530 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.602507 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cgfrr"] Apr 22 18:03:23.611108 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.611088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf2vv\" (UniqueName: \"kubernetes.io/projected/0fc7769d-9817-4836-937f-76b67a1a7cba-kube-api-access-rf2vv\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: \"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:23.611204 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.611126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc7769d-9817-4836-937f-76b67a1a7cba-tls-certs\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: \"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:23.712427 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.712395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc7769d-9817-4836-937f-76b67a1a7cba-tls-certs\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: \"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:23.712613 ip-10-0-135-143 
kubenswrapper[2574]: I0422 18:03:23.712455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rf2vv\" (UniqueName: \"kubernetes.io/projected/0fc7769d-9817-4836-937f-76b67a1a7cba-kube-api-access-rf2vv\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: \"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:23.712613 ip-10-0-135-143 kubenswrapper[2574]: E0422 18:03:23.712561 2574 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 22 18:03:23.712723 ip-10-0-135-143 kubenswrapper[2574]: E0422 18:03:23.712652 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fc7769d-9817-4836-937f-76b67a1a7cba-tls-certs podName:0fc7769d-9817-4836-937f-76b67a1a7cba nodeName:}" failed. No retries permitted until 2026-04-22 18:03:24.212628261 +0000 UTC m=+626.334738417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/0fc7769d-9817-4836-937f-76b67a1a7cba-tls-certs") pod "model-serving-api-86f7b4b499-cgfrr" (UID: "0fc7769d-9817-4836-937f-76b67a1a7cba") : secret "model-serving-api-tls" not found Apr 22 18:03:23.722390 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:23.722362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf2vv\" (UniqueName: \"kubernetes.io/projected/0fc7769d-9817-4836-937f-76b67a1a7cba-kube-api-access-rf2vv\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: \"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:24.217089 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:24.217054 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc7769d-9817-4836-937f-76b67a1a7cba-tls-certs\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: 
\"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:24.219448 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:24.219430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fc7769d-9817-4836-937f-76b67a1a7cba-tls-certs\") pod \"model-serving-api-86f7b4b499-cgfrr\" (UID: \"0fc7769d-9817-4836-937f-76b67a1a7cba\") " pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:24.501155 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:24.501068 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:24.615101 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:24.615074 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cgfrr"] Apr 22 18:03:24.617251 ip-10-0-135-143 kubenswrapper[2574]: W0422 18:03:24.617225 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc7769d_9817_4836_937f_76b67a1a7cba.slice/crio-9a8bf065846d02e1d8643533971fc09b78cfa042b7eb84c53669d7553344dd66 WatchSource:0}: Error finding container 9a8bf065846d02e1d8643533971fc09b78cfa042b7eb84c53669d7553344dd66: Status 404 returned error can't find the container with id 9a8bf065846d02e1d8643533971fc09b78cfa042b7eb84c53669d7553344dd66 Apr 22 18:03:25.081676 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:25.081638 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cgfrr" event={"ID":"0fc7769d-9817-4836-937f-76b67a1a7cba","Type":"ContainerStarted","Data":"9a8bf065846d02e1d8643533971fc09b78cfa042b7eb84c53669d7553344dd66"} Apr 22 18:03:27.087885 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:27.087856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cgfrr" 
event={"ID":"0fc7769d-9817-4836-937f-76b67a1a7cba","Type":"ContainerStarted","Data":"6aec061fac5c0bdd8205dbe8c7c7cc21aed033f937b25f12b9a2d7a629a2492b"} Apr 22 18:03:27.088313 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:27.087957 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:03:27.104908 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:27.104859 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-cgfrr" podStartSLOduration=1.791096793 podStartE2EDuration="4.104847617s" podCreationTimestamp="2026-04-22 18:03:23 +0000 UTC" firstStartedPulling="2026-04-22 18:03:24.619400725 +0000 UTC m=+626.741510878" lastFinishedPulling="2026-04-22 18:03:26.933151549 +0000 UTC m=+629.055261702" observedRunningTime="2026-04-22 18:03:27.103866425 +0000 UTC m=+629.225976614" watchObservedRunningTime="2026-04-22 18:03:27.104847617 +0000 UTC m=+629.226957831" Apr 22 18:03:38.094227 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:03:38.094199 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-cgfrr" Apr 22 18:07:58.312689 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:07:58.312606 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:07:58.314100 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:07:58.314071 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:12:58.329217 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:12:58.329191 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:12:58.330778 ip-10-0-135-143 
kubenswrapper[2574]: I0422 18:12:58.330758 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:17:58.344998 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:17:58.344968 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:17:58.347493 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:17:58.347314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:22:58.365220 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:22:58.365194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:22:58.371164 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:22:58.367370 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:27:58.383649 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:27:58.383539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:27:58.387570 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:27:58.385579 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:32:58.399496 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:32:58.399403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:32:58.403445 ip-10-0-135-143 
kubenswrapper[2574]: I0422 18:32:58.401457 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:37:58.421573 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:37:58.421461 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:37:58.425574 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:37:58.425101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:42:58.438101 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:42:58.437980 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:42:58.441969 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:42:58.441936 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:47:58.454661 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:47:58.454542 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:47:58.458675 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:47:58.458227 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:52:58.472370 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:52:58.472227 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:52:58.477162 ip-10-0-135-143 
kubenswrapper[2574]: I0422 18:52:58.475031 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:57:58.490966 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:57:58.490849 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 18:57:58.494990 ip-10-0-135-143 kubenswrapper[2574]: I0422 18:57:58.492518 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:02:58.505763 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:02:58.505643 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:02:58.509873 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:02:58.508340 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:07:58.521282 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:07:58.521164 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:07:58.525716 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:07:58.525695 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:12:58.537673 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:12:58.537548 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:12:58.541618 ip-10-0-135-143 
kubenswrapper[2574]: I0422 19:12:58.541601 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:17:58.554547 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:17:58.554433 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:17:58.558496 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:17:58.557376 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:22:58.572424 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:22:58.572283 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:22:58.577444 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:22:58.576458 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:27:31.000968 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.000888 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtxx5/must-gather-rlc66"] Apr 22 19:27:31.004042 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.004025 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.006589 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.006563 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtxx5\"/\"openshift-service-ca.crt\"" Apr 22 19:27:31.006717 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.006646 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtxx5\"/\"kube-root-ca.crt\"" Apr 22 19:27:31.007757 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.007713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtxx5\"/\"default-dockercfg-jqt9f\"" Apr 22 19:27:31.010975 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.010793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/must-gather-rlc66"] Apr 22 19:27:31.114316 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.114279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca594791-4153-4254-9c73-ddd4c916dbf5-must-gather-output\") pod \"must-gather-rlc66\" (UID: \"ca594791-4153-4254-9c73-ddd4c916dbf5\") " pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.114316 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.114347 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28nvn\" (UniqueName: \"kubernetes.io/projected/ca594791-4153-4254-9c73-ddd4c916dbf5-kube-api-access-28nvn\") pod \"must-gather-rlc66\" (UID: \"ca594791-4153-4254-9c73-ddd4c916dbf5\") " pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.215321 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.215290 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28nvn\" (UniqueName: 
\"kubernetes.io/projected/ca594791-4153-4254-9c73-ddd4c916dbf5-kube-api-access-28nvn\") pod \"must-gather-rlc66\" (UID: \"ca594791-4153-4254-9c73-ddd4c916dbf5\") " pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.215452 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.215367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca594791-4153-4254-9c73-ddd4c916dbf5-must-gather-output\") pod \"must-gather-rlc66\" (UID: \"ca594791-4153-4254-9c73-ddd4c916dbf5\") " pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.215649 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.215635 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca594791-4153-4254-9c73-ddd4c916dbf5-must-gather-output\") pod \"must-gather-rlc66\" (UID: \"ca594791-4153-4254-9c73-ddd4c916dbf5\") " pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.223272 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.223246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28nvn\" (UniqueName: \"kubernetes.io/projected/ca594791-4153-4254-9c73-ddd4c916dbf5-kube-api-access-28nvn\") pod \"must-gather-rlc66\" (UID: \"ca594791-4153-4254-9c73-ddd4c916dbf5\") " pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.314037 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.313963 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtxx5/must-gather-rlc66" Apr 22 19:27:31.429950 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.429916 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/must-gather-rlc66"] Apr 22 19:27:31.432613 ip-10-0-135-143 kubenswrapper[2574]: W0422 19:27:31.432570 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca594791_4153_4254_9c73_ddd4c916dbf5.slice/crio-f6ac51a745337e3eb75e6a26149d4cdecc0d5fbc2676cf3e6cf2a09973dc44ab WatchSource:0}: Error finding container f6ac51a745337e3eb75e6a26149d4cdecc0d5fbc2676cf3e6cf2a09973dc44ab: Status 404 returned error can't find the container with id f6ac51a745337e3eb75e6a26149d4cdecc0d5fbc2676cf3e6cf2a09973dc44ab Apr 22 19:27:31.434203 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.434187 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:27:31.822866 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:31.822831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/must-gather-rlc66" event={"ID":"ca594791-4153-4254-9c73-ddd4c916dbf5","Type":"ContainerStarted","Data":"f6ac51a745337e3eb75e6a26149d4cdecc0d5fbc2676cf3e6cf2a09973dc44ab"} Apr 22 19:27:32.828805 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:32.828713 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/must-gather-rlc66" event={"ID":"ca594791-4153-4254-9c73-ddd4c916dbf5","Type":"ContainerStarted","Data":"5e837c6d240796affe39526d56511821f89c8859e2e2c96cc5f53388ccfc036b"} Apr 22 19:27:32.828805 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:32.828753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/must-gather-rlc66" 
event={"ID":"ca594791-4153-4254-9c73-ddd4c916dbf5","Type":"ContainerStarted","Data":"3fceca73909ebfbb6337fbd3f82cc9c87becf9c003e793d8e31c9e604feccf67"} Apr 22 19:27:32.846496 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:32.846449 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtxx5/must-gather-rlc66" podStartSLOduration=1.995812753 podStartE2EDuration="2.84643526s" podCreationTimestamp="2026-04-22 19:27:30 +0000 UTC" firstStartedPulling="2026-04-22 19:27:31.434310475 +0000 UTC m=+5673.556420629" lastFinishedPulling="2026-04-22 19:27:32.284932979 +0000 UTC m=+5674.407043136" observedRunningTime="2026-04-22 19:27:32.844986878 +0000 UTC m=+5674.967097064" watchObservedRunningTime="2026-04-22 19:27:32.84643526 +0000 UTC m=+5674.968545435" Apr 22 19:27:33.652346 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:33.652302 2574 ???:1] "http: TLS handshake error from 10.0.132.24:47990: EOF" Apr 22 19:27:33.663677 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:33.663651 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wvndp_e29345c1-1d57-44eb-aacb-0f51d483baf0/global-pull-secret-syncer/0.log" Apr 22 19:27:33.809824 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:33.809795 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mp6ht_e7747e6a-0f9c-48fa-a8e9-4648a187366c/konnectivity-agent/0.log" Apr 22 19:27:33.878308 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:33.878277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-143.ec2.internal_5b16281a969b22277334b5b3f7efb159/haproxy/0.log" Apr 22 19:27:37.567023 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:37.566992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zp6sd_075f4a8c-dce4-4d3f-bcbe-b60f503daedc/node-exporter/0.log" Apr 22 19:27:37.590305 ip-10-0-135-143 
kubenswrapper[2574]: I0422 19:27:37.590252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zp6sd_075f4a8c-dce4-4d3f-bcbe-b60f503daedc/kube-rbac-proxy/0.log" Apr 22 19:27:37.612408 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:37.612383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zp6sd_075f4a8c-dce4-4d3f-bcbe-b60f503daedc/init-textfile/0.log" Apr 22 19:27:40.595575 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.595543 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78"] Apr 22 19:27:40.600124 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.600091 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.609100 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.608653 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78"] Apr 22 19:27:40.691379 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.691314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-proc\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.691562 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.691416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-lib-modules\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.691562 
ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.691459 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-sys\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.691562 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.691544 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-podres\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.691741 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.691585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44r7r\" (UniqueName: \"kubernetes.io/projected/add4e3e0-ea04-45a0-9b83-8908637781db-kube-api-access-44r7r\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792168 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-lib-modules\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792168 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-sys\") pod 
\"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792437 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-podres\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792437 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44r7r\" (UniqueName: \"kubernetes.io/projected/add4e3e0-ea04-45a0-9b83-8908637781db-kube-api-access-44r7r\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792437 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-lib-modules\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792607 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-sys\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792607 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"podres\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-podres\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792712 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-proc\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.792771 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.792725 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/add4e3e0-ea04-45a0-9b83-8908637781db-proc\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.801667 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.801638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44r7r\" (UniqueName: \"kubernetes.io/projected/add4e3e0-ea04-45a0-9b83-8908637781db-kube-api-access-44r7r\") pod \"perf-node-gather-daemonset-trq78\" (UID: \"add4e3e0-ea04-45a0-9b83-8908637781db\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:40.913185 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:40.913156 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:41.052370 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.052300 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78"] Apr 22 19:27:41.054391 ip-10-0-135-143 kubenswrapper[2574]: W0422 19:27:41.054365 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podadd4e3e0_ea04_45a0_9b83_8908637781db.slice/crio-1e73bf3fe5479f2357ef9a927c30ec0d2935b10f6f87871e5706725fcb8107b3 WatchSource:0}: Error finding container 1e73bf3fe5479f2357ef9a927c30ec0d2935b10f6f87871e5706725fcb8107b3: Status 404 returned error can't find the container with id 1e73bf3fe5479f2357ef9a927c30ec0d2935b10f6f87871e5706725fcb8107b3 Apr 22 19:27:41.146928 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.146909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k7jtj_f672956a-1ca0-441f-a859-2d7cf55a77c5/dns/0.log" Apr 22 19:27:41.164944 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.164897 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k7jtj_f672956a-1ca0-441f-a859-2d7cf55a77c5/kube-rbac-proxy/0.log" Apr 22 19:27:41.184446 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.184429 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8jbmf_0e63511d-3bbf-4d8b-bff0-7c1cc73c694c/dns-node-resolver/0.log" Apr 22 19:27:41.672898 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.672874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wl6lv_f8bb80bf-d436-4bad-a3bf-b26dcc359766/node-ca/0.log" Apr 22 19:27:41.863657 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.863621 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" 
event={"ID":"add4e3e0-ea04-45a0-9b83-8908637781db","Type":"ContainerStarted","Data":"dc73877030bb480d71862f52d259496adafd7cd7d81e8d4a3bdb3cac5b9d1750"} Apr 22 19:27:41.863657 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.863656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" event={"ID":"add4e3e0-ea04-45a0-9b83-8908637781db","Type":"ContainerStarted","Data":"1e73bf3fe5479f2357ef9a927c30ec0d2935b10f6f87871e5706725fcb8107b3"} Apr 22 19:27:41.863847 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.863680 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:41.879006 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:41.878951 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" podStartSLOduration=1.878935302 podStartE2EDuration="1.878935302s" podCreationTimestamp="2026-04-22 19:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:41.877815209 +0000 UTC m=+5683.999925386" watchObservedRunningTime="2026-04-22 19:27:41.878935302 +0000 UTC m=+5684.001045479" Apr 22 19:27:42.641244 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:42.641204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b965g_7747539e-8e37-4640-80ce-2863534d185a/serve-healthcheck-canary/0.log" Apr 22 19:27:43.147163 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:43.147138 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tw8jw_015ddcfd-46c1-460a-bfbe-a6cbe3eed75e/kube-rbac-proxy/0.log" Apr 22 19:27:43.166736 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:43.166715 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-tw8jw_015ddcfd-46c1-460a-bfbe-a6cbe3eed75e/exporter/0.log" Apr 22 19:27:43.186553 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:43.186531 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tw8jw_015ddcfd-46c1-460a-bfbe-a6cbe3eed75e/extractor/0.log" Apr 22 19:27:45.043940 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:45.043906 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-4sjj9_27292c88-6b62-4441-a982-7817b18ea524/manager/0.log" Apr 22 19:27:45.068771 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:45.068744 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-cgfrr_0fc7769d-9817-4836-937f-76b67a1a7cba/server/0.log" Apr 22 19:27:47.877393 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:47.876545 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-trq78" Apr 22 19:27:50.190967 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.190938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-555gb_aead7c8c-4dc1-4092-8e69-4f857803c825/kube-multus/0.log" Apr 22 19:27:50.427344 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.427299 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/kube-multus-additional-cni-plugins/0.log" Apr 22 19:27:50.448127 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.448050 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/egress-router-binary-copy/0.log" Apr 22 19:27:50.467083 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.467059 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/cni-plugins/0.log" Apr 22 19:27:50.487408 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.487391 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/bond-cni-plugin/0.log" Apr 22 19:27:50.506691 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.506672 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/routeoverride-cni/0.log" Apr 22 19:27:50.529996 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.529979 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/whereabouts-cni-bincopy/0.log" Apr 22 19:27:50.554148 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.554128 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v2plq_3848ad83-45cd-43a9-8803-52cd31ab6f05/whereabouts-cni/0.log" Apr 22 19:27:50.858852 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.858760 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hgdxx_8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c/network-metrics-daemon/0.log" Apr 22 19:27:50.877815 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:50.877789 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hgdxx_8531c9f8-7bc6-4bf2-a5f8-b7bd3a1c203c/kube-rbac-proxy/0.log" Apr 22 19:27:52.003374 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.003344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-controller/0.log" Apr 22 19:27:52.031843 ip-10-0-135-143 kubenswrapper[2574]: 
I0422 19:27:52.031814 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/0.log" Apr 22 19:27:52.056892 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.056867 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovn-acl-logging/1.log" Apr 22 19:27:52.075695 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.075665 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/kube-rbac-proxy-node/0.log" Apr 22 19:27:52.095339 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.095242 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:27:52.115437 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.115420 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/northd/0.log" Apr 22 19:27:52.134594 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.134578 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/nbdb/0.log" Apr 22 19:27:52.153736 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.153714 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/sbdb/0.log" Apr 22 19:27:52.252064 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:52.252035 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qv8jk_b77e864f-03d8-422c-8cfd-a0a44bbce6e2/ovnkube-controller/0.log" Apr 22 19:27:53.500890 ip-10-0-135-143 kubenswrapper[2574]: I0422 
19:27:53.500863 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sbf9k_02186aeb-7b19-4faf-885c-e060403bd058/network-check-target-container/0.log" Apr 22 19:27:54.392928 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:54.392889 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rlmgg_15e19007-4d32-4de6-9d4e-fbfa1c190965/iptables-alerter/0.log" Apr 22 19:27:55.004941 ip-10-0-135-143 kubenswrapper[2574]: I0422 19:27:55.004917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gtxxh_925c84c5-beea-448d-af92-aa9ab7a10629/tuned/0.log"