Apr 17 11:13:30.270643 ip-10-0-141-16 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:30.270654 ip-10-0-141-16 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:30.270661 ip-10-0-141-16 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:30.270862 ip-10-0-141-16 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:13:40.468241 ip-10-0-141-16 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:13:40.468260 ip-10-0-141-16 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 13dcd788af6a463bad077bfbe9253f5d --
Apr 17 11:16:06.947374 ip-10-0-141-16 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:07.437922 ip-10-0-141-16 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:07.437922 ip-10-0-141-16 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:07.437922 ip-10-0-141-16 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:07.437922 ip-10-0-141-16 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:07.437922 ip-10-0-141-16 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:07.438903 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.438839 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:07.445028 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445015 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:07.445028 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445028 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445033 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445036 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445039 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445042 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445045 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445047 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445050 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445053 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445056 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445059 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445061 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445064 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445066 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445069 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445071 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445074 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445076 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445079 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445081 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:07.445095 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445083 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445086 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445088 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445091 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445093 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445095 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445098 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445101 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445104 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445106 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445109 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445111 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445130 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445133 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445136 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445138 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445141 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445143 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445146 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445148 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:07.445563 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445157 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445160 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445162 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445165 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445167 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445169 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445172 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445174 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445176 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445179 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445181 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445183 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445185 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445190 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445194 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445197 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445199 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445202 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445204 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:07.446029 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445207 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445209 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445212 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445215 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445217 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445220 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445227 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445230 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445233 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445235 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445239 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445241 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445243 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445246 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445248 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445250 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445253 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445255 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445258 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445260 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445262 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:07.446495 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445265 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445267 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445271 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445274 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445277 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445649 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445654 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445658 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445661 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445664 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445667 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445669 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445672 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445675 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445677 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445680 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445682 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445685 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445688 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:07.447011 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445690 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445694 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445698 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445700 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445703 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445705 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445708 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445710 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445713 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445716 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445718 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445721 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445723 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445725 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445728 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445730 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445733 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445735 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445738 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:07.447462 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445741 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445744 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445746 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445749 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445752 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445754 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445756 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445759 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445761 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445764 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445766 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445769 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445772 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445774 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445777 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445779 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445782 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445784 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445786 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445789 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:07.447921 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445791 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445794 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445796 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445798 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445801 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445803 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445805 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445808 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445811 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445814 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445817 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445819 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445823 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445826 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445829 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445831 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445834 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445836 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445838 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:07.448645 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445841 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445843 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445846 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445848 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445851 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445853 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445856 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445858 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445861 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445863 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445865 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445869 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445871 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.445873 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445939 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445945 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445952 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445956 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445960 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445964 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445968 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:07.449210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445972 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445975 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445978 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445982 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445985 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445988 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445991 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445994 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.445997 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446000 2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446002 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446005 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446009 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446012 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446015 2571 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446017 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446020 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446024 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446027 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446031 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446034 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446037 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446040 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446043 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446046 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:07.449704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446049 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446052 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446055 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446058 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446061 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446064 2571 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446066 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446071 2571 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446074 2571 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446076 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446080 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446083 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446087 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446090 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446093 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446097 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446100 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446102 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446105 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446108 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446111 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446126 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446130 2571 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446133 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446136 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:16:07.450335 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446139 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446143 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446146 2571
flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446149 2571 flags.go:64] FLAG: --help="false" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446152 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-141-16.ec2.internal" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446155 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446157 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446160 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446164 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446167 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446170 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446172 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446175 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446178 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446181 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446184 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:07.450969 ip-10-0-141-16 
kubenswrapper[2571]: I0417 11:16:07.446187 2571 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446190 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446193 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446196 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446199 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446202 2571 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446205 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446208 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 11:16:07.450969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446211 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446216 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446218 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446221 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446224 2571 flags.go:64] FLAG: --logging-format="text" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446227 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446230 2571 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446233 2571 flags.go:64] FLAG: --manifest-url="" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446235 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446240 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446244 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446253 2571 flags.go:64] FLAG: --max-pods="110" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446255 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446258 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446261 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446264 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446267 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446270 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446272 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446280 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446283 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 11:16:07.451548 
ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446286 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446291 2571 flags.go:64] FLAG: --pod-cidr="" Apr 17 11:16:07.451548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446294 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446299 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446304 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446309 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446312 2571 flags.go:64] FLAG: --port="10250" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446314 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446317 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c1ddde5ceb9a1d3a" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446320 2571 flags.go:64] FLAG: --qos-reserved="" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446323 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446326 2571 flags.go:64] FLAG: --register-node="true" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446329 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446331 2571 flags.go:64] FLAG: --register-with-taints="" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 
11:16:07.446335 2571 flags.go:64] FLAG: --registry-burst="10" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446337 2571 flags.go:64] FLAG: --registry-qps="5" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446340 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446343 2571 flags.go:64] FLAG: --reserved-memory="" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446346 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446349 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446352 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446355 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446358 2571 flags.go:64] FLAG: --runonce="false" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446361 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446364 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446366 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446369 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446372 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 11:16:07.452150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446375 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 11:16:07.452743 ip-10-0-141-16 
kubenswrapper[2571]: I0417 11:16:07.446378 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446381 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446383 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446386 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446389 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446391 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446394 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446398 2571 flags.go:64] FLAG: --system-cgroups="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446402 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446407 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446410 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446413 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446417 2571 flags.go:64] FLAG: --tls-min-version="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446419 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446422 2571 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446425 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446428 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446430 2571 flags.go:64] FLAG: --v="2" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446434 2571 flags.go:64] FLAG: --version="false" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446438 2571 flags.go:64] FLAG: --vmodule="" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446442 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.446445 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446523 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:07.452743 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446527 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446530 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446533 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446536 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446538 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446541 2571 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446544 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446546 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446549 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446551 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446554 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446556 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446559 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446561 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446564 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446566 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446570 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446574 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:07.453309 
ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446576 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446579 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:07.453309 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446582 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446584 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446587 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446589 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446592 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446594 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446597 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446599 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446601 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446604 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446606 2571 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446609 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446612 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446615 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446619 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446622 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446625 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446628 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446631 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:07.453791 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446634 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446636 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446639 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446641 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:07.454272 
ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446644 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446646 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446649 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446651 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446654 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446658 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446662 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446665 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446667 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446670 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446673 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446675 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446678 2571 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446680 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446683 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446685 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:07.454272 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446687 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446690 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446692 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446695 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446697 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446700 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446702 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446705 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446707 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 
11:16:07.446709 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446712 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446715 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446717 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446720 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446722 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446725 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446728 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446732 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446735 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446737 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:07.454768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446740 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:07.455349 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446744 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:07.455349 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446748 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:07.455349 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446750 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:07.455349 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446753 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:07.455349 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.446755 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:07.455349 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.447409 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} 
Apr 17 11:16:07.455784 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.455767 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 11:16:07.455812 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.455785 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 11:16:07.455841 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455829 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:07.455841 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455834 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:07.455841 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455837 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:07.455841 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455839 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:07.455841 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455842 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455845 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455848 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455850 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455853 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455856 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455858 2571 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455861 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455863 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455866 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455869 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455871 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455875 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455879 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455882 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455886 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455890 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455894 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455896 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:07.455960 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455899 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455902 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455905 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455907 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455910 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455912 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455914 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455917 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455920 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455922 2571 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455924 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455927 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455929 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455932 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455934 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455938 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455940 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455943 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455946 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455948 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:07.456436 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455950 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455953 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: 
W0417 11:16:07.455956 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455959 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455961 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455964 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455967 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455970 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455972 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455975 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455977 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455980 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455982 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455985 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455987 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 
11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455990 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455992 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455994 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455997 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.455999 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:07.456983 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456002 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456005 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456007 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456010 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456012 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456014 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456017 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456020 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456023 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456025 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456028 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456030 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456033 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456036 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456039 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456053 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456056 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456059 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456062 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456064 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:07.457475 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456067 2571 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456069 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456072 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.456077 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456178 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456184 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456187 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456190 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456193 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456196 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 11:16:07.457941 
ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456199 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456201 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456206 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456209 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:07.457941 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456212 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456215 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456217 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456220 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456223 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456226 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456228 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456231 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 
11:16:07.456233 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456236 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456238 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456241 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456244 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456247 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456249 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456252 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456256 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456258 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456261 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456263 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 11:16:07.458296 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456266 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456268 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456271 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456273 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456275 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456278 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456280 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456283 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456285 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456287 2571 feature_gate.go:328] unrecognized feature 
gate: NetworkSegmentation Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456290 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456292 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456294 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456297 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456299 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456302 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456304 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456307 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456310 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456312 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:07.458768 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456314 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456317 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456320 
2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456322 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456325 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456328 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456331 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456333 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456335 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456338 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456340 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456343 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456345 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456347 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456350 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 11:16:07.459289 
ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456352 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456355 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456357 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456360 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456362 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 11:16:07.459289 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456364 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456367 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456369 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456371 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456374 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456376 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456394 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456398 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 11:16:07.459750 ip-10-0-141-16 
kubenswrapper[2571]: W0417 11:16:07.456401 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456404 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456407 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456409 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456411 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456414 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456416 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:07.456418 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 17 11:16:07.459750 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.456423 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 11:16:07.460160 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.457068 2571 server.go:962] "Client rotation is on, will bootstrap in 
background" Apr 17 11:16:07.460160 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.459668 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 11:16:07.460856 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.460845 2571 server.go:1019] "Starting client certificate rotation" Apr 17 11:16:07.460964 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.460948 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 11:16:07.460999 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.460984 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 11:16:07.486532 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.486515 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 11:16:07.489184 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.489165 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 11:16:07.504215 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.504196 2571 log.go:25] "Validated CRI v1 runtime API" Apr 17 11:16:07.510058 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.510043 2571 log.go:25] "Validated CRI v1 image API" Apr 17 11:16:07.511333 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.511318 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 11:16:07.516156 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.516136 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:07.516642 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.516620 2571 fs.go:135] Filesystem UUIDs: 
map[29628457-e195-4148-9fa4-177545423c2d:/dev/nvme0n1p4 357e6c51-94d3-4eca-bb2e-dcabff3665d1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 11:16:07.516718 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.516641 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 11:16:07.522381 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.522271 2571 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:07.520202387 +0000 UTC m=+0.436727304 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3188688 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22460f406d36671f3e387dbe7b998f SystemUUID:ec22460f-406d-3667-1f3e-387dbe7b998f BootID:13dcd788-af6a-463b-ad07-7bfbe9253f5d Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 
Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ce:4a:16:ef:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ce:4a:16:ef:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:f6:b9:79:88:4e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 11:16:07.522381 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.522369 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
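The cAdvisor `Machine:` record above reports raw values (bytes, hyperthread counts). A small sketch interpreting a few of them; the constants are copied from the log line, and the GiB conversion assumes the usual binary unit (2^30 bytes):

```python
# Values copied from the cAdvisor Machine record in the log above.
memory_capacity_bytes = 33164496896   # MemoryCapacity
num_cores = 8                         # NumCores (logical CPUs / hyperthreads)
num_physical_cores = 4                # NumPhysicalCores

memory_gib = memory_capacity_bytes / 2**30     # binary GiB
threads_per_core = num_cores // num_physical_cores

print(f"{memory_gib:.2f} GiB, {threads_per_core} threads/core")
# prints "30.89 GiB, 2 threads/core"
```

This matches the topology section: four physical cores, each listing two thread IDs (e.g. `Threads:[0 4]`), with roughly 31 GiB in a single NUMA node.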
Apr 17 11:16:07.522539 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.522457 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:07.523673 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.523643 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:07.523818 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.523675 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:16:07.523907 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.523831 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:16:07.523907 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.523843 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:16:07.523907 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.523860 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:07.524688 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.524675 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:07.525468 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.525456 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:16:07.525590 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.525579 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:16:07.528210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.528199 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:16:07.528278 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.528221 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:16:07.528278 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.528238 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:16:07.528278 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.528250 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:16:07.528278 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.528261 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:16:07.529363 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.529350 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:07.529425 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.529372 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:07.532664 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.532650 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 11:16:07.533874 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.533857 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 11:16:07.535537 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535524 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535543 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535553 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535564 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535572 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535580 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535588 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535596 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535605 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 11:16:07.535616 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535614 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 11:16:07.535867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535637 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 11:16:07.535867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.535651 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 11:16:07.537334 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.537323 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 11:16:07.537397 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.537337 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 11:16:07.537397 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.537340 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dbhcf"
Apr 17 11:16:07.540592 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.540573 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 11:16:07.540666 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.540615 2571 server.go:1295] "Started kubelet"
Apr 17 11:16:07.541154 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.541093 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 11:16:07.541154 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.541132 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 11:16:07.541291 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.541168 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 11:16:07.541387 ip-10-0-141-16 systemd[1]: Started Kubernetes Kubelet.
Apr 17 11:16:07.542092 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.542077 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 11:16:07.542603 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.542586 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 11:16:07.542603 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.542594 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 11:16:07.542722 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.542663 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-16.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 11:16:07.544019 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.544004 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 11:16:07.544586 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.544562 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dbhcf"
Apr 17 11:16:07.549926 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.546047 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-16.ec2.internal.18a720bb768896e2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-16.ec2.internal,UID:ip-10-0-141-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-16.ec2.internal,},FirstTimestamp:2026-04-17 11:16:07.54058621 +0000 UTC m=+0.457111128,LastTimestamp:2026-04-17 11:16:07.54058621 +0000 UTC m=+0.457111128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-16.ec2.internal,}"
Apr 17 11:16:07.551285 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.551261 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 11:16:07.551420 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.551402 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 11:16:07.551851 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.551835 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 11:16:07.552604 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552437 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 11:16:07.552604 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552451 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 11:16:07.552604 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552581 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 11:16:07.552740 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.552693 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:07.552830 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552810 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 11:16:07.552830 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552832 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 11:16:07.552971 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552863 2571 factory.go:153] Registering CRI-O factory
Apr 17 11:16:07.552971 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552918 2571 factory.go:223] Registration of the crio container factory successfully
Apr 17 11:16:07.552971 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552968 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 11:16:07.553101 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552979 2571 factory.go:55] Registering systemd factory
Apr 17 11:16:07.553101 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.552987 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 17 11:16:07.553101 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.553009 2571 factory.go:103] Registering Raw factory
Apr 17 11:16:07.553101 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.553023 2571 manager.go:1196] Started watching for new ooms in manager
Apr 17 11:16:07.553547 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.553529 2571 manager.go:319] Starting recovery of all containers
Apr 17 11:16:07.553609 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.553582 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:07.555976 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.555950 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.562961 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.562850 2571 manager.go:324] Recovery completed
Apr 17 11:16:07.566774 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.566763 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:07.568909 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.568897 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:07.569010 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.568920 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:07.569010 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.568931 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:07.569350 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.569340 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 11:16:07.569399 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.569349 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 11:16:07.569399 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.569365 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:16:07.571327 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.571316 2571 policy_none.go:49] "None policy: Start"
Apr 17 11:16:07.571370 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.571331 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 11:16:07.571370 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.571340 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613351 2571 manager.go:341] "Starting Device Plugin manager"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.613372 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613381 2571 server.go:85] "Starting device plugin registration server"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613549 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613557 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613631 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613689 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.613694 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.614152 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 11:16:07.621219 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.614189 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:07.680274 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.680243 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 11:16:07.681341 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.681325 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 11:16:07.681422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.681346 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 11:16:07.681422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.681360 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 11:16:07.681422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.681366 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 11:16:07.681422 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.681391 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 11:16:07.684973 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.684957 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:07.714452 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.714408 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:07.715224 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.715203 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:07.715302 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.715232 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:07.715302 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.715243 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:07.715302 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.715261 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.723153 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.723112 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.723202 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.723160 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-16.ec2.internal\": node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:07.745369 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.745348 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:07.782057 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.782024 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"]
Apr 17 11:16:07.782142 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.782088 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:07.782820 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.782805 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:07.782897 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.782832 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:07.782897 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.782843 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:07.784000 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.783989 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:07.784160 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784146 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.784201 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784174 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:07.784637 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784623 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:07.784693 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784627 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:07.784693 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784670 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:07.784693 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784679 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:07.784770 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784650 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:07.784770 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.784720 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:07.785634 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.785619 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.785696 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.785644 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:07.786243 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.786230 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:07.786317 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.786257 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:07.786317 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.786271 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:07.814595 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.814575 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.818862 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.818847 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-16.ec2.internal\" not found" node="ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.845612 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.845594 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:07.854210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.854194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.854282 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.854218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.854282 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.854233 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.946031 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:07.946011 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:07.954360 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.954341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.954421 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.954366 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.954421 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.954383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.954421 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.954417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.954538 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.954431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/237efac7542ae805317afa8331e5e27b-config\") pod \"kube-apiserver-proxy-ip-10-0-141-16.ec2.internal\" (UID: \"237efac7542ae805317afa8331e5e27b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:07.954538 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:07.954456 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/27d97c8c240a436d06b1c4f45cd224be-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal\" (UID: \"27d97c8c240a436d06b1c4f45cd224be\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:08.046727 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.046676 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.116174 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.116156 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:08.121659 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.121639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:08.147351 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.147325 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.247837 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.247802 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.348311 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.348261 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.448878 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.448861 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.460302 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.460287 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:16:08.460433 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.460414 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:08.460469 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.460434 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:08.548587 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.548557 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:07 +0000 UTC" deadline="2027-10-13 09:35:22.356091336 +0000 UTC"
Apr 17 11:16:08.548587 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.548585 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13054h19m13.80750957s"
Apr 17 11:16:08.549662 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.549642 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.552176 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.552159 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 11:16:08.562095 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.562077 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:08.580234 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.580216 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jzqrq"
Apr 17 11:16:08.587726 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.587710 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jzqrq"
Apr 17 11:16:08.626408 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:08.626376 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod237efac7542ae805317afa8331e5e27b.slice/crio-4dedd7020d9c87fea1ca07a130a83eae680c06da82af4299e64f931cbf547b3a WatchSource:0}: Error finding container 4dedd7020d9c87fea1ca07a130a83eae680c06da82af4299e64f931cbf547b3a: Status 404 returned error can't find the container with id 4dedd7020d9c87fea1ca07a130a83eae680c06da82af4299e64f931cbf547b3a
Apr 17 11:16:08.626623 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:08.626605 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d97c8c240a436d06b1c4f45cd224be.slice/crio-7475a381cd94339aae9b6df50e8019ecf1702903d5e046892bf98dc350a1ce44 WatchSource:0}: Error finding container 7475a381cd94339aae9b6df50e8019ecf1702903d5e046892bf98dc350a1ce44: Status 404 returned error can't find the container with id 7475a381cd94339aae9b6df50e8019ecf1702903d5e046892bf98dc350a1ce44
Apr 17 11:16:08.629996 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.629978 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:16:08.649890 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.649868 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.683807 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.683761 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" event={"ID":"237efac7542ae805317afa8331e5e27b","Type":"ContainerStarted","Data":"4dedd7020d9c87fea1ca07a130a83eae680c06da82af4299e64f931cbf547b3a"}
Apr 17 11:16:08.684623 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.684607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerStarted","Data":"7475a381cd94339aae9b6df50e8019ecf1702903d5e046892bf98dc350a1ce44"}
Apr 17 11:16:08.728540 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.728522 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:08.750912 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:08.750893 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-16.ec2.internal\" not found"
Apr 17 11:16:08.792062 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.792034 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:08.853059 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.853041 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:08.864671 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.864657 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:08.865604 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.865574 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal"
Apr 17 11:16:08.873953 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:08.873939 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can
result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:09.529506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.529479 2571 apiserver.go:52] "Watching apiserver" Apr 17 11:16:09.536628 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.536605 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:16:09.538799 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.538775 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zc47d","openshift-network-operator/iptables-alerter-npwhk","openshift-ovn-kubernetes/ovnkube-node-xmrcg","kube-system/konnectivity-agent-5pxkv","kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t","openshift-image-registry/node-ca-lqhrs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal","openshift-multus/network-metrics-daemon-6zdmq","openshift-network-diagnostics/network-check-target-w7npk","openshift-cluster-node-tuning-operator/tuned-gh4f5","openshift-dns/node-resolver-k8djp","openshift-multus/multus-additional-cni-plugins-d76ls"] Apr 17 11:16:09.541371 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.541347 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lqhrs" Apr 17 11:16:09.541463 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.541390 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-npwhk" Apr 17 11:16:09.542602 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.542584 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.543733 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.543712 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5pxkv" Apr 17 11:16:09.544304 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.544289 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.544304 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.544298 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.544720 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.544699 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2flrb\"" Apr 17 11:16:09.544887 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.544856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.544992 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.544920 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.545642 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.545621 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9sgx7\"" Apr 17 11:16:09.547129 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.545860 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:09.547129 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.546442 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.547129 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.546543 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 11:16:09.547129 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.546699 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:16:09.547352 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.547169 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:09.547352 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.547258 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 11:16:09.547806 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.547787 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:16:09.547806 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.547804 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9sgbr\"" Apr 17 11:16:09.549011 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.548565 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-qqsxn\"" Apr 17 11:16:09.549011 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.548652 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.549297 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.549249 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:09.550089 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.550072 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 
17 11:16:09.550329 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.549594 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.552061 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.552040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.552180 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.552135 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:09.552239 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.552204 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:09.553008 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.552899 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.553008 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.552971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q7f9b\"" Apr 17 11:16:09.553008 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.552979 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.553008 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.552983 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 11:16:09.553761 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.553743 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:09.553901 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.553878 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:09.554427 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.554408 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:16:09.554506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.554416 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 11:16:09.554506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.554470 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dptdg\"" Apr 17 11:16:09.554770 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.554752 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.555068 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.555049 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.555209 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.555193 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" Apr 17 11:16:09.556570 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.556550 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k8djp" Apr 17 11:16:09.557488 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.557469 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.557488 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.557483 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mbtv5\"" Apr 17 11:16:09.557624 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.557528 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.557877 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.557861 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d76ls" Apr 17 11:16:09.558793 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.558775 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:09.559238 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.559221 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:09.559327 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.559246 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-s9wql\"" Apr 17 11:16:09.560274 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.560254 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 11:16:09.560461 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.560395 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:16:09.560614 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.560601 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-77pwz\"" Apr 17 11:16:09.562575 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-os-release\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.562664 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-host-slash\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk" Apr 17 11:16:09.562720 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562703 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06811185-7a8c-419b-9f44-d67b67d794d3-host\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs" Apr 17 11:16:09.562778 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562722 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-var-lib-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.562824 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/670fcc7b-8343-46ca-b1b5-00040742a8e8-agent-certs\") pod \"konnectivity-agent-5pxkv\" (UID: \"670fcc7b-8343-46ca-b1b5-00040742a8e8\") " pod="kube-system/konnectivity-agent-5pxkv" Apr 17 11:16:09.562824 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.562918 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562857 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:09.562918 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-etc-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563012 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562923 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-cni-bin\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563012 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovnkube-config\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563012 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.562972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-etc-selinux\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: 
\"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.563174 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563021 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpvz\" (UniqueName: \"kubernetes.io/projected/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-kube-api-access-vdpvz\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.563174 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563064 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-kubelet\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563174 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-cni-multus\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563174 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-conf-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563174 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563154 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xktnr\" (UniqueName: \"kubernetes.io/projected/06811185-7a8c-419b-9f44-d67b67d794d3-kube-api-access-xktnr\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563193 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-cni-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563243 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563257 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-cnibin\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 
11:16:09.563280 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-kubelet\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-daemon-config\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vz2\" (UniqueName: \"kubernetes.io/projected/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-kube-api-access-q2vz2\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vdhr\" (UniqueName: \"kubernetes.io/projected/1fa09637-267c-4a4b-8aac-54287c81cc4e-kube-api-access-5vdhr\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-cni-binary-copy\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563465 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-registration-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-systemd-units\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-node-log\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563534 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-run-ovn-kubernetes\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-cni-netd\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563596 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb7xt\" (UniqueName: \"kubernetes.io/projected/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-kube-api-access-pb7xt\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-env-overrides\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovn-node-metrics-cert\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563668 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-multus-certs\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-systemd\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-socket-dir-parent\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.563757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563738 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-iptables-alerter-script\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-slash\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt29k\" (UniqueName: \"kubernetes.io/projected/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-kube-api-access-zt29k\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovnkube-script-lib\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-k8s-cni-cncf-io\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-cni-bin\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-hostroot\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.563975 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-etc-kubernetes\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564006 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-ovn\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-netns\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-log-socket\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-socket-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-device-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-sys-fs\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06811185-7a8c-419b-9f44-d67b67d794d3-serviceca\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-run-netns\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564244 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/670fcc7b-8343-46ca-b1b5-00040742a8e8-konnectivity-ca\") pod \"konnectivity-agent-5pxkv\" (UID: \"670fcc7b-8343-46ca-b1b5-00040742a8e8\") " pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:09.564400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.564268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-system-cni-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.567383 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.567365 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:09.589044 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.589023 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:08 +0000 UTC" deadline="2027-12-22 16:24:29.391589483 +0000 UTC"
Apr 17 11:16:09.589044 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.589043 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14741h8m19.802548954s"
Apr 17 11:16:09.653579 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.653554 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 11:16:09.664886 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.664858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-run-netns\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665001 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.664894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/670fcc7b-8343-46ca-b1b5-00040742a8e8-konnectivity-ca\") pod \"konnectivity-agent-5pxkv\" (UID: \"670fcc7b-8343-46ca-b1b5-00040742a8e8\") " pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:09.665001 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.664919 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-system-cni-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.665001 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.664934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-os-release\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.665001 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.664957 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysctl-d\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.665001 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.664982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-run-netns\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665018 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-system-cni-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-os-release\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665041 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-host-slash\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-sys\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06811185-7a8c-419b-9f44-d67b67d794d3-host\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-host-slash\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-var-lib-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/670fcc7b-8343-46ca-b1b5-00040742a8e8-agent-certs\") pod \"konnectivity-agent-5pxkv\" (UID: \"670fcc7b-8343-46ca-b1b5-00040742a8e8\") " pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06811185-7a8c-419b-9f44-d67b67d794d3-host\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-var-lib-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.665332 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-systemd\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665372 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2r7c\" (UniqueName: \"kubernetes.io/projected/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-kube-api-access-c2r7c\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665482 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/670fcc7b-8343-46ca-b1b5-00040742a8e8-konnectivity-ca\") pod \"konnectivity-agent-5pxkv\" (UID: \"670fcc7b-8343-46ca-b1b5-00040742a8e8\") " pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-etc-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665538 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-cni-bin\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovnkube-config\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-etc-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665587 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-etc-selinux\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-cni-bin\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665598 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpvz\" (UniqueName: \"kubernetes.io/projected/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-kube-api-access-vdpvz\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysctl-conf\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665710 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-tuned\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665711 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-etc-selinux\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-kubelet\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-cni-multus\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.665969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-conf-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-cni-multus\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-kubelet\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665822 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-modprobe-d\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-conf-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-var-lib-kubelet\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-os-release\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665903 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xktnr\" (UniqueName: \"kubernetes.io/projected/06811185-7a8c-419b-9f44-d67b67d794d3-kube-api-access-xktnr\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-cni-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.665981 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-lib-modules\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666007 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-system-cni-dir\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666034 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-cnibin\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666074 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-cni-dir\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-kubelet\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.666639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666095 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovnkube-config\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666136 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-daemon-config\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-kubelet\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666160 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-cnibin\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vz2\" (UniqueName: \"kubernetes.io/projected/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-kube-api-access-q2vz2\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.666182 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666213 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysconfig\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.666273 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:10.166238381 +0000 UTC m=+3.082763305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666278 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-openvswitch\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vdhr\" (UniqueName: \"kubernetes.io/projected/1fa09637-267c-4a4b-8aac-54287c81cc4e-kube-api-access-5vdhr\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666341 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-cni-binary-copy\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-registration-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666397 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1cd366b-5311-44d3-af2a-8b067cf4f65a-tmp\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvsqh\" (UniqueName: \"kubernetes.io/projected/a61e6396-9d01-4767-84e7-6240ed2764cc-kube-api-access-cvsqh\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-systemd-units\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-node-log\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.667438 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-registration-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-run-ovn-kubernetes\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666542 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-cni-netd\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-node-log\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-run-ovn-kubernetes\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666609 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-cni-netd\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-systemd-units\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-hosts-file\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-daemon-config\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-tmp-dir\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb7xt\" (UniqueName: \"kubernetes.io/projected/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-kube-api-access-pb7xt\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-env-overrides\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666723 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovn-node-metrics-cert\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-multus-certs\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-host\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666774 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-cni-binary-copy\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668280 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-systemd\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-socket-dir-parent\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-run\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6lpkh\" (UniqueName: \"kubernetes.io/projected/f1cd366b-5311-44d3-af2a-8b067cf4f65a-kube-api-access-6lpkh\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666879 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-cnibin\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-systemd\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-multus-socket-dir-parent\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-iptables-alerter-script\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666966 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-multus-certs\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.666980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-slash\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zt29k\" (UniqueName: \"kubernetes.io/projected/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-kube-api-access-zt29k\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovnkube-script-lib\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-k8s-cni-cncf-io\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-cni-bin\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-hostroot\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-etc-kubernetes\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.668990 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-ovn\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 
11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-netns\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-log-socket\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667312 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-socket-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-host-slash\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-device-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" 
Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-env-overrides\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-var-lib-cni-bin\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-sys-fs\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-device-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-kubernetes\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-netns\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-log-socket\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-hostroot\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-iptables-alerter-script\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06811185-7a8c-419b-9f44-d67b67d794d3-serviceca\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs" Apr 17 
11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-socket-dir\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-etc-kubernetes\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.669619 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1fa09637-267c-4a4b-8aac-54287c81cc4e-run-ovn\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.670205 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-sys-fs\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.670205 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-host-run-k8s-cni-cncf-io\") pod \"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d" Apr 17 11:16:09.670205 ip-10-0-141-16 
kubenswrapper[2571]: I0417 11:16:09.667914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovnkube-script-lib\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.670205 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.667998 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06811185-7a8c-419b-9f44-d67b67d794d3-serviceca\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs" Apr 17 11:16:09.670205 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.669242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fa09637-267c-4a4b-8aac-54287c81cc4e-ovn-node-metrics-cert\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:16:09.670205 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.669442 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/670fcc7b-8343-46ca-b1b5-00040742a8e8-agent-certs\") pod \"konnectivity-agent-5pxkv\" (UID: \"670fcc7b-8343-46ca-b1b5-00040742a8e8\") " pod="kube-system/konnectivity-agent-5pxkv" Apr 17 11:16:09.679562 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.679437 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:09.679562 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.679462 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:09.679562 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.679476 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:09.679802 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:09.679680 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed. No retries permitted until 2026-04-17 11:16:10.179664233 +0000 UTC m=+3.096189153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:09.682515 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.682488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpvz\" (UniqueName: \"kubernetes.io/projected/dec438d4-4f5d-4e28-b4c4-3f1139b211ff-kube-api-access-vdpvz\") pod \"aws-ebs-csi-driver-node-7g74t\" (UID: \"dec438d4-4f5d-4e28-b4c4-3f1139b211ff\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" Apr 17 11:16:09.682515 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.682505 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vz2\" (UniqueName: \"kubernetes.io/projected/7a6e582c-8fc4-4d48-a9f1-63fa4e09787a-kube-api-access-q2vz2\") pod 
\"multus-zc47d\" (UID: \"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a\") " pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.682653 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.682497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb7xt\" (UniqueName: \"kubernetes.io/projected/e8a849a2-893a-4e45-b82e-22ee8ac74d6e-kube-api-access-pb7xt\") pod \"iptables-alerter-npwhk\" (UID: \"e8a849a2-893a-4e45-b82e-22ee8ac74d6e\") " pod="openshift-network-operator/iptables-alerter-npwhk"
Apr 17 11:16:09.685681 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.685666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xktnr\" (UniqueName: \"kubernetes.io/projected/06811185-7a8c-419b-9f44-d67b67d794d3-kube-api-access-xktnr\") pod \"node-ca-lqhrs\" (UID: \"06811185-7a8c-419b-9f44-d67b67d794d3\") " pod="openshift-image-registry/node-ca-lqhrs"
Apr 17 11:16:09.685762 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.685704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt29k\" (UniqueName: \"kubernetes.io/projected/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-kube-api-access-zt29k\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:09.686660 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.686633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vdhr\" (UniqueName: \"kubernetes.io/projected/1fa09637-267c-4a4b-8aac-54287c81cc4e-kube-api-access-5vdhr\") pod \"ovnkube-node-xmrcg\" (UID: \"1fa09637-267c-4a4b-8aac-54287c81cc4e\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.768021 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.767991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysctl-d\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768021 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768027 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-sys\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768155 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-sys\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-systemd\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768198 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysctl-d\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2r7c\" (UniqueName: \"kubernetes.io/projected/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-kube-api-access-c2r7c\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysctl-conf\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-systemd\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-tuned\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768302 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-modprobe-d\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-var-lib-kubelet\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768415 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-modprobe-d\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-os-release\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysctl-conf\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-lib-modules\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-os-release\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-var-lib-kubelet\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-system-cni-dir\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-system-cni-dir\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysconfig\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-lib-modules\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768581 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1cd366b-5311-44d3-af2a-8b067cf4f65a-tmp\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.768611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-sysconfig\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvsqh\" (UniqueName: \"kubernetes.io/projected/a61e6396-9d01-4767-84e7-6240ed2764cc-kube-api-access-cvsqh\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-hosts-file\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-tmp-dir\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768800 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-host\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-run\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpkh\" (UniqueName: \"kubernetes.io/projected/f1cd366b-5311-44d3-af2a-8b067cf4f65a-kube-api-access-6lpkh\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-cnibin\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-hosts-file\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-host\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.768941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-kubernetes\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.769008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a61e6396-9d01-4767-84e7-6240ed2764cc-cnibin\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.769013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-kubernetes\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.769055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1cd366b-5311-44d3-af2a-8b067cf4f65a-run\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.770015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.769113 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-tmp-dir\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.770865 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.769219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-cni-binary-copy\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.770865 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.769883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a61e6396-9d01-4767-84e7-6240ed2764cc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.771872 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.771853 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f1cd366b-5311-44d3-af2a-8b067cf4f65a-etc-tuned\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.771938 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.771876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f1cd366b-5311-44d3-af2a-8b067cf4f65a-tmp\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.776718 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.776682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2r7c\" (UniqueName: \"kubernetes.io/projected/88f53a5c-9a8b-457b-9e6e-e62bf112bbb8-kube-api-access-c2r7c\") pod \"node-resolver-k8djp\" (UID: \"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8\") " pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.777039 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.777020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpkh\" (UniqueName: \"kubernetes.io/projected/f1cd366b-5311-44d3-af2a-8b067cf4f65a-kube-api-access-6lpkh\") pod \"tuned-gh4f5\" (UID: \"f1cd366b-5311-44d3-af2a-8b067cf4f65a\") " pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.777112 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.777045 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvsqh\" (UniqueName: \"kubernetes.io/projected/a61e6396-9d01-4767-84e7-6240ed2764cc-kube-api-access-cvsqh\") pod \"multus-additional-cni-plugins-d76ls\" (UID: \"a61e6396-9d01-4767-84e7-6240ed2764cc\") " pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.855061 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.854999 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lqhrs"
Apr 17 11:16:09.862729 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.862709 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-npwhk"
Apr 17 11:16:09.870193 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.870173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:09.875741 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.875726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:09.882737 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.882723 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t"
Apr 17 11:16:09.890305 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.890277 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zc47d"
Apr 17 11:16:09.897856 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.897839 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gh4f5"
Apr 17 11:16:09.904363 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.904346 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k8djp"
Apr 17 11:16:09.907833 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.907819 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d76ls"
Apr 17 11:16:09.982302 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:09.982280 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:10.143939 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.143902 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61e6396_9d01_4767_84e7_6240ed2764cc.slice/crio-c9786e179c53b88e0c6b34334e28a9582076171a47e557f23db040a6e5d8c2a1 WatchSource:0}: Error finding container c9786e179c53b88e0c6b34334e28a9582076171a47e557f23db040a6e5d8c2a1: Status 404 returned error can't find the container with id c9786e179c53b88e0c6b34334e28a9582076171a47e557f23db040a6e5d8c2a1
Apr 17 11:16:10.145648 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.145611 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a849a2_893a_4e45_b82e_22ee8ac74d6e.slice/crio-f9bcfac5c6c47995b5f51e8c3893ced161d53c997c1f3c9aee1067a5fa26450a WatchSource:0}: Error finding container f9bcfac5c6c47995b5f51e8c3893ced161d53c997c1f3c9aee1067a5fa26450a: Status 404 returned error can't find the container with id f9bcfac5c6c47995b5f51e8c3893ced161d53c997c1f3c9aee1067a5fa26450a
Apr 17 11:16:10.146930 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.146831 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6e582c_8fc4_4d48_a9f1_63fa4e09787a.slice/crio-7c30d25c8639aafdfdc9c6b4990f5f9894a0b299ceb1b5d5f1ba0fbdf9197056 WatchSource:0}: Error finding container 7c30d25c8639aafdfdc9c6b4990f5f9894a0b299ceb1b5d5f1ba0fbdf9197056: Status 404 returned error can't find the container with id 7c30d25c8639aafdfdc9c6b4990f5f9894a0b299ceb1b5d5f1ba0fbdf9197056
Apr 17 11:16:10.148887 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.148072 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1cd366b_5311_44d3_af2a_8b067cf4f65a.slice/crio-90608d69b27e005c6102f8b6be3916a0907dca46b1eee7e0229a74e58b28fad7 WatchSource:0}: Error finding container 90608d69b27e005c6102f8b6be3916a0907dca46b1eee7e0229a74e58b28fad7: Status 404 returned error can't find the container with id 90608d69b27e005c6102f8b6be3916a0907dca46b1eee7e0229a74e58b28fad7
Apr 17 11:16:10.149961 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.149864 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06811185_7a8c_419b_9f44_d67b67d794d3.slice/crio-180ff19b478e8dbed9b4cc7593311b9f6d1c4174863a3c4455afda7070025ea6 WatchSource:0}: Error finding container 180ff19b478e8dbed9b4cc7593311b9f6d1c4174863a3c4455afda7070025ea6: Status 404 returned error can't find the container with id 180ff19b478e8dbed9b4cc7593311b9f6d1c4174863a3c4455afda7070025ea6
Apr 17 11:16:10.150640 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.150621 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f53a5c_9a8b_457b_9e6e_e62bf112bbb8.slice/crio-1f5e7b5ba5e4ea6eef1769f7ca00929af2e435532fd6d281fbb9e2959fff80b3 WatchSource:0}: Error finding container 1f5e7b5ba5e4ea6eef1769f7ca00929af2e435532fd6d281fbb9e2959fff80b3: Status 404 returned error can't find the container with id 1f5e7b5ba5e4ea6eef1769f7ca00929af2e435532fd6d281fbb9e2959fff80b3
Apr 17 11:16:10.151898 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.151630 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa09637_267c_4a4b_8aac_54287c81cc4e.slice/crio-4180fa0a6e13cb16e1c341ac47dfe5c1ab0be460f8264e5fd775d4a181fce76a WatchSource:0}: Error finding container 4180fa0a6e13cb16e1c341ac47dfe5c1ab0be460f8264e5fd775d4a181fce76a: Status 404 returned error can't find the container with id 4180fa0a6e13cb16e1c341ac47dfe5c1ab0be460f8264e5fd775d4a181fce76a
Apr 17 11:16:10.152397 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.152382 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod670fcc7b_8343_46ca_b1b5_00040742a8e8.slice/crio-ccd8a223c256cba15b8fcc0491b5d38348fcb8d646a8d0969462592e1f1484a3 WatchSource:0}: Error finding container ccd8a223c256cba15b8fcc0491b5d38348fcb8d646a8d0969462592e1f1484a3: Status 404 returned error can't find the container with id ccd8a223c256cba15b8fcc0491b5d38348fcb8d646a8d0969462592e1f1484a3
Apr 17 11:16:10.154248 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:10.153532 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec438d4_4f5d_4e28_b4c4_3f1139b211ff.slice/crio-fd63dc37373e7c71e7c3c67d41a56a9f8849d5c92ba186c761ee6b0e74285ac0 WatchSource:0}: Error finding container fd63dc37373e7c71e7c3c67d41a56a9f8849d5c92ba186c761ee6b0e74285ac0: Status 404 returned error can't find the container with id fd63dc37373e7c71e7c3c67d41a56a9f8849d5c92ba186c761ee6b0e74285ac0
Apr 17 11:16:10.171263 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.171108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:10.171326 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.171219 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:10.171371 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.171363 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.171346731 +0000 UTC m=+4.087871636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:10.272182 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.272161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:10.272302 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.272289 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:10.272343 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.272306 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:10.272343 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.272314 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:10.272418 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.272355 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed. No retries permitted until 2026-04-17 11:16:11.272341562 +0000 UTC m=+4.188866466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:10.590015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.589885 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:08 +0000 UTC" deadline="2027-11-15 22:23:15.925425936 +0000 UTC"
Apr 17 11:16:10.590015 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.589919 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13859h7m5.335510655s"
Apr 17 11:16:10.683508 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.683055 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:10.683508 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:10.683179 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:10.693693 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.693339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zc47d" event={"ID":"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a","Type":"ContainerStarted","Data":"7c30d25c8639aafdfdc9c6b4990f5f9894a0b299ceb1b5d5f1ba0fbdf9197056"}
Apr 17 11:16:10.694530 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.694505 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lqhrs" event={"ID":"06811185-7a8c-419b-9f44-d67b67d794d3","Type":"ContainerStarted","Data":"180ff19b478e8dbed9b4cc7593311b9f6d1c4174863a3c4455afda7070025ea6"}
Apr 17 11:16:10.699309 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.699284 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" event={"ID":"f1cd366b-5311-44d3-af2a-8b067cf4f65a","Type":"ContainerStarted","Data":"90608d69b27e005c6102f8b6be3916a0907dca46b1eee7e0229a74e58b28fad7"}
Apr 17 11:16:10.702004 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.701956 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"4180fa0a6e13cb16e1c341ac47dfe5c1ab0be460f8264e5fd775d4a181fce76a"}
Apr 17 11:16:10.707761 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.707718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k8djp" event={"ID":"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8","Type":"ContainerStarted","Data":"1f5e7b5ba5e4ea6eef1769f7ca00929af2e435532fd6d281fbb9e2959fff80b3"}
Apr 17 11:16:10.718173 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.718109 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-npwhk" event={"ID":"e8a849a2-893a-4e45-b82e-22ee8ac74d6e","Type":"ContainerStarted","Data":"f9bcfac5c6c47995b5f51e8c3893ced161d53c997c1f3c9aee1067a5fa26450a"}
Apr 17 11:16:10.721283 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.721242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerStarted","Data":"c9786e179c53b88e0c6b34334e28a9582076171a47e557f23db040a6e5d8c2a1"}
Apr 17 11:16:10.728593 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.728425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" event={"ID":"237efac7542ae805317afa8331e5e27b","Type":"ContainerStarted","Data":"c0b1770e91abeac4028812264a7dd639ece1630b68e9b308f1586cf9b65c5773"}
Apr 17 11:16:10.737702 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.737676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" event={"ID":"dec438d4-4f5d-4e28-b4c4-3f1139b211ff","Type":"ContainerStarted","Data":"fd63dc37373e7c71e7c3c67d41a56a9f8849d5c92ba186c761ee6b0e74285ac0"}
Apr 17 11:16:10.740724 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.740702 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5pxkv" event={"ID":"670fcc7b-8343-46ca-b1b5-00040742a8e8","Type":"ContainerStarted","Data":"ccd8a223c256cba15b8fcc0491b5d38348fcb8d646a8d0969462592e1f1484a3"}
Apr 17 11:16:10.751930 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:10.751874 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-16.ec2.internal" podStartSLOduration=2.75186127 podStartE2EDuration="2.75186127s" podCreationTimestamp="2026-04-17 11:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:10.745731473 +0000 UTC m=+3.662256397" watchObservedRunningTime="2026-04-17 11:16:10.75186127 +0000 UTC m=+3.668386198"
Apr 17 11:16:11.179757 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:11.179258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:11.179757 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.179397 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:11.179757 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.179450 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.17943251 +0000 UTC m=+6.095957428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:11.280000 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:11.279968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:11.280222 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.280197 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:11.280222 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.280222 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:11.280345 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.280235 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:11.280345 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.280292 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed.
No retries permitted until 2026-04-17 11:16:13.280273903 +0000 UTC m=+6.196798813 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:11.682878 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:11.682796 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:11.683290 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:11.682939 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:11.768824 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:11.768787 2571 generic.go:358] "Generic (PLEG): container finished" podID="27d97c8c240a436d06b1c4f45cd224be" containerID="1e0f3a32782082b3240c2618f9c09296acbb46d205bb483f05ceab785424546c" exitCode=0 Apr 17 11:16:11.769329 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:11.769290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerDied","Data":"1e0f3a32782082b3240c2618f9c09296acbb46d205bb483f05ceab785424546c"} Apr 17 11:16:12.682484 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:12.682038 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:12.682484 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:12.682161 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:12.778551 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:12.778519 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" event={"ID":"27d97c8c240a436d06b1c4f45cd224be","Type":"ContainerStarted","Data":"20c81ed4f0bd0e941f3cdd4c45100619c919673d713eabd92d37cfb4549de35d"} Apr 17 11:16:13.197507 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:13.197393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:13.197666 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.197540 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.197666 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.197606 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:17.19758823 +0000 UTC m=+10.114113149 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.298689 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:13.298655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:13.298854 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.298832 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:13.298854 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.298852 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:13.298965 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.298867 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.298965 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.298929 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:17.298910893 +0000 UTC m=+10.215435800 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.682768 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:13.682243 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:13.682768 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:13.682386 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:14.682240 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:14.682187 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:14.682676 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:14.682378 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:15.682396 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:15.681894 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:15.682396 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:15.682028 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:16.681936 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:16.681898 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:16.682102 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:16.682019 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:17.036992 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.036882 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-16.ec2.internal" podStartSLOduration=9.036864256 podStartE2EDuration="9.036864256s" podCreationTimestamp="2026-04-17 11:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:12.793544462 +0000 UTC m=+5.710069389" watchObservedRunningTime="2026-04-17 11:16:17.036864256 +0000 UTC m=+9.953389183" Apr 17 11:16:17.037766 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.037748 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-n25xj"] Apr 17 11:16:17.040383 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.040361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.040486 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.040441 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa" Apr 17 11:16:17.130641 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.130585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75fcf97e-549f-48be-9a3a-142bc6a20eaa-kubelet-config\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.130641 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.130640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.130871 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.130746 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75fcf97e-549f-48be-9a3a-142bc6a20eaa-dbus\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.232147 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.232005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.232147 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.232090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/75fcf97e-549f-48be-9a3a-142bc6a20eaa-dbus\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.232147 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.232138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:17.232400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.232189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75fcf97e-549f-48be-9a3a-142bc6a20eaa-kubelet-config\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.232400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.232277 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75fcf97e-549f-48be-9a3a-142bc6a20eaa-kubelet-config\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.232400 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.232375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75fcf97e-549f-48be-9a3a-142bc6a20eaa-dbus\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.232533 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.232382 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:17.232533 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.232432 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:17.232533 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.232469 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:25.232449079 +0000 UTC m=+18.148973997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:17.232533 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.232488 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret podName:75fcf97e-549f-48be-9a3a-142bc6a20eaa nodeName:}" failed. No retries permitted until 2026-04-17 11:16:17.73247869 +0000 UTC m=+10.649003609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret") pod "global-pull-secret-syncer-n25xj" (UID: "75fcf97e-549f-48be-9a3a-142bc6a20eaa") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:17.333492 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.333398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:17.333638 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.333577 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:17.333638 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.333599 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:17.333638 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.333611 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:17.333787 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.333673 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:25.333654163 +0000 UTC m=+18.250179082 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:17.682707 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.682629 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:17.682859 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.682746 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:17.737890 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:17.737852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:17.738081 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.738064 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:17.738197 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:17.738175 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret podName:75fcf97e-549f-48be-9a3a-142bc6a20eaa nodeName:}" failed. No retries permitted until 2026-04-17 11:16:18.73815529 +0000 UTC m=+11.654680212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret") pod "global-pull-secret-syncer-n25xj" (UID: "75fcf97e-549f-48be-9a3a-142bc6a20eaa") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:18.681722 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:18.681691 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:18.682173 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:18.681694 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:18.682173 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:18.681793 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa" Apr 17 11:16:18.682173 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:18.681901 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:18.745903 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:18.745877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:18.746052 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:18.745980 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:18.746052 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:18.746032 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret podName:75fcf97e-549f-48be-9a3a-142bc6a20eaa nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:20.74601609 +0000 UTC m=+13.662541011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret") pod "global-pull-secret-syncer-n25xj" (UID: "75fcf97e-549f-48be-9a3a-142bc6a20eaa") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:19.682545 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:19.682513 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:19.683000 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:19.682653 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:20.681902 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:20.681870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:20.682079 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:20.681870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:20.682079 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:20.681972 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:20.682079 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:20.682057 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa" Apr 17 11:16:20.759918 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:20.759889 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:20.760356 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:20.760063 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:20.760356 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:20.760145 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret podName:75fcf97e-549f-48be-9a3a-142bc6a20eaa nodeName:}" failed. No retries permitted until 2026-04-17 11:16:24.760111022 +0000 UTC m=+17.676635927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret") pod "global-pull-secret-syncer-n25xj" (UID: "75fcf97e-549f-48be-9a3a-142bc6a20eaa") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:21.682366 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:21.682328 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:21.682527 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:21.682465 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:22.681769 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:22.681741 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:22.682167 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:22.681851 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:22.682167 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:22.681750 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:22.682167 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:22.682006 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:23.682187 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:23.682154 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:23.682612 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:23.682269 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:24.682338 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:24.682307 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:24.682872 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:24.682308 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:24.682872 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:24.682463 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:24.682872 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:24.682577 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:24.792348 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:24.792321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:24.792522 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:24.792453 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:24.792585 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:24.792522 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret podName:75fcf97e-549f-48be-9a3a-142bc6a20eaa nodeName:}" failed. No retries permitted until 2026-04-17 11:16:32.79250225 +0000 UTC m=+25.709027162 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret") pod "global-pull-secret-syncer-n25xj" (UID: "75fcf97e-549f-48be-9a3a-142bc6a20eaa") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:25.297550 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:25.297520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:25.297728 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.297618 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:25.297728 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.297672 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.297653311 +0000 UTC m=+34.214178215 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:25.398324 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:25.398288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:25.398461 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.398442 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:25.398514 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.398467 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:25.398514 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.398477 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:25.398595 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.398525 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.39851302 +0000 UTC m=+34.315037924 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:25.682200 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:25.682135 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:25.682337 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:25.682245 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:26.682340 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:26.682245 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:26.682340 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:26.682258 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:26.682690 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:26.682360 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:26.682690 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:26.682446 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:26.807571 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:26.807368 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5pxkv" event={"ID":"670fcc7b-8343-46ca-b1b5-00040742a8e8","Type":"ContainerStarted","Data":"23026c8727065c13701625a0f717cdbd3512af36ab73128072e773e3c1722177"}
Apr 17 11:16:27.683376 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.683177 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:27.683897 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:27.683478 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:27.818057 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.818005 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lqhrs" event={"ID":"06811185-7a8c-419b-9f44-d67b67d794d3","Type":"ContainerStarted","Data":"2b762e95d3b8bbd6d4e1071d5b2c526e678f58adb934cf44c86ffb8279690e63"}
Apr 17 11:16:27.819527 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.819483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" event={"ID":"f1cd366b-5311-44d3-af2a-8b067cf4f65a","Type":"ContainerStarted","Data":"a0d8555a64f6738a0305a426ab2decb1659b7124426cb36749c2f408d119e82d"}
Apr 17 11:16:27.822237 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822214 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:16:27.822559 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822509 2571 generic.go:358] "Generic (PLEG): container finished" podID="1fa09637-267c-4a4b-8aac-54287c81cc4e" containerID="9c2b3f7ec7f1100b85def5dbed45d8320df1dbe9560ee5984bd91ae790107161" exitCode=1
Apr 17 11:16:27.822650 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"87c580b518148f0758ec4d1ae63d7d06f25e8e9b579bc0ecfcceb9dfbb661e13"}
Apr 17 11:16:27.822650 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"eff0f92584cfe276fb703fff4d8189c8f4c7ec3ebb6aca934c6ac46f0ad0a0d4"}
Apr 17 11:16:27.822650 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"65b11559b80f6b37f72193cd4bcb6b921a365087ecab15483fbfd20197e529ea"}
Apr 17 11:16:27.822650 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"51acd850f48a862872f7748932644ac8023408c30f4339a3fe40d04c01b31b7e"}
Apr 17 11:16:27.822839 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerDied","Data":"9c2b3f7ec7f1100b85def5dbed45d8320df1dbe9560ee5984bd91ae790107161"}
Apr 17 11:16:27.822839 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.822676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"9501ae3e1648452ca06e0ce94fb272165be73c5a71350a662c8049abbe24ab20"}
Apr 17 11:16:27.823876 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.823854 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k8djp" event={"ID":"88f53a5c-9a8b-457b-9e6e-e62bf112bbb8","Type":"ContainerStarted","Data":"1adb4157c0754f21cc015caafb8189356b18b6897bf4f1ed6147a9488d5588bc"}
Apr 17 11:16:27.825425 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.825353 2571 generic.go:358] "Generic (PLEG): container finished" podID="a61e6396-9d01-4767-84e7-6240ed2764cc" containerID="0b21b721a931e686b8d1a559921f2e93706f5aad0f0dff7424b81948b94ff324" exitCode=0
Apr 17 11:16:27.825425 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.825386 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerDied","Data":"0b21b721a931e686b8d1a559921f2e93706f5aad0f0dff7424b81948b94ff324"}
Apr 17 11:16:27.826842 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.826817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" event={"ID":"dec438d4-4f5d-4e28-b4c4-3f1139b211ff","Type":"ContainerStarted","Data":"1d913d58d3ac515a886a8a4f2b399f45af90604494172b6b18c97cd38c6ff496"}
Apr 17 11:16:27.828370 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.828346 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zc47d" event={"ID":"7a6e582c-8fc4-4d48-a9f1-63fa4e09787a","Type":"ContainerStarted","Data":"73e0456ab4d00e91bae66125727aa34e763aa787d0ed0fff1e6e0a847614d65a"}
Apr 17 11:16:27.834587 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.834537 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lqhrs" podStartSLOduration=4.679069875 podStartE2EDuration="20.834522246s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.152239941 +0000 UTC m=+3.068764848" lastFinishedPulling="2026-04-17 11:16:26.307692297 +0000 UTC m=+19.224217219" observedRunningTime="2026-04-17 11:16:27.833707246 +0000 UTC m=+20.750232172" watchObservedRunningTime="2026-04-17 11:16:27.834522246 +0000 UTC m=+20.751047175"
Apr 17 11:16:27.850786 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.850748 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zc47d" podStartSLOduration=4.347117255 podStartE2EDuration="20.850737616s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.148874837 +0000 UTC m=+3.065399742" lastFinishedPulling="2026-04-17 11:16:26.652495184 +0000 UTC m=+19.569020103" observedRunningTime="2026-04-17 11:16:27.850673172 +0000 UTC m=+20.767198097" watchObservedRunningTime="2026-04-17 11:16:27.850737616 +0000 UTC m=+20.767262565"
Apr 17 11:16:27.865746 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.865708 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k8djp" podStartSLOduration=4.400592792 podStartE2EDuration="20.86570023s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.152770518 +0000 UTC m=+3.069295423" lastFinishedPulling="2026-04-17 11:16:26.617877943 +0000 UTC m=+19.534402861" observedRunningTime="2026-04-17 11:16:27.865586347 +0000 UTC m=+20.782111272" watchObservedRunningTime="2026-04-17 11:16:27.86570023 +0000 UTC m=+20.782225156"
Apr 17 11:16:27.903143 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.903076 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gh4f5" podStartSLOduration=4.43542326 podStartE2EDuration="20.903064445s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.150250119 +0000 UTC m=+3.066775023" lastFinishedPulling="2026-04-17 11:16:26.617891297 +0000 UTC m=+19.534416208" observedRunningTime="2026-04-17 11:16:27.881809707 +0000 UTC m=+20.798334633" watchObservedRunningTime="2026-04-17 11:16:27.903064445 +0000 UTC m=+20.819589374"
Apr 17 11:16:27.920916 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:27.920887 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5pxkv" podStartSLOduration=9.002137962 podStartE2EDuration="20.920878653s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.154685265 +0000 UTC m=+3.071210173" lastFinishedPulling="2026-04-17 11:16:22.073425949 +0000 UTC m=+14.989950864" observedRunningTime="2026-04-17 11:16:27.920137654 +0000 UTC m=+20.836662574" watchObservedRunningTime="2026-04-17 11:16:27.920878653 +0000 UTC m=+20.837403578"
Apr 17 11:16:28.188195 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.188164 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 11:16:28.627234 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.627058 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:28.188185482Z","UUID":"310b2fe5-4218-4538-a3e8-6c45992f8d75","Handler":null,"Name":"","Endpoint":""}
Apr 17 11:16:28.631648 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.630727 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 11:16:28.631648 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.630762 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 11:16:28.682059 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.682028 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:28.682216 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.682028 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:28.682216 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:28.682169 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:28.682318 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:28.682245 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:28.832025 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.831972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-npwhk" event={"ID":"e8a849a2-893a-4e45-b82e-22ee8ac74d6e","Type":"ContainerStarted","Data":"7bdbb4a6863551adeb93b39c0d2e6925f2bf2030151fb5d2a3ff304d10bab2fc"}
Apr 17 11:16:28.833977 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.833950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" event={"ID":"dec438d4-4f5d-4e28-b4c4-3f1139b211ff","Type":"ContainerStarted","Data":"84d08a6c9abd15f47b123820675f1a838f94c6daaf326393e0b1cd5ef0806044"}
Apr 17 11:16:28.847679 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:28.847624 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-npwhk" podStartSLOduration=5.687029634 podStartE2EDuration="21.847611517s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.147109219 +0000 UTC m=+3.063634123" lastFinishedPulling="2026-04-17 11:16:26.307691088 +0000 UTC m=+19.224216006" observedRunningTime="2026-04-17 11:16:28.84725574 +0000 UTC m=+21.763780666" watchObservedRunningTime="2026-04-17 11:16:28.847611517 +0000 UTC m=+21.764136443"
Apr 17 11:16:29.210829 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.210805 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:29.211422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.211403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5pxkv"
Apr 17 11:16:29.682411 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.682169 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:29.682675 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:29.682525 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:29.839037 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.839015 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:16:29.839412 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.839390 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"bd430c5306f82b370a710cec347191fc4353f421d021def19a6ab60d3de20272"}
Apr 17 11:16:29.841800 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.841767 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" event={"ID":"dec438d4-4f5d-4e28-b4c4-3f1139b211ff","Type":"ContainerStarted","Data":"3b42766189ce391948b05826c88ebc73713a324e4c990a5897a372dd1fca3c42"}
Apr 17 11:16:29.861524 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:29.861463 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7g74t" podStartSLOduration=3.9457975210000003 podStartE2EDuration="22.86144939s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.155456578 +0000 UTC m=+3.071981484" lastFinishedPulling="2026-04-17 11:16:29.071108441 +0000 UTC m=+21.987633353" observedRunningTime="2026-04-17 11:16:29.861423615 +0000 UTC m=+22.777948541" watchObservedRunningTime="2026-04-17 11:16:29.86144939 +0000 UTC m=+22.777974316"
Apr 17 11:16:30.682026 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:30.681993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:30.682026 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:30.682028 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:30.682318 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:30.682131 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:30.682318 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:30.682234 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:30.843100 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:30.843075 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 11:16:31.682509 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:31.682478 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:31.682657 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:31.682577 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:32.682385 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.682193 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:32.682876 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.682193 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:32.682876 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:32.682462 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:32.682876 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:32.682523 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:32.847528 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.847500 2571 generic.go:358] "Generic (PLEG): container finished" podID="a61e6396-9d01-4767-84e7-6240ed2764cc" containerID="a90edc72779492def16ca74aa5670f3e576ce225483f0d548d76678a7fc30ee1" exitCode=0
Apr 17 11:16:32.847687 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.847591 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerDied","Data":"a90edc72779492def16ca74aa5670f3e576ce225483f0d548d76678a7fc30ee1"}
Apr 17 11:16:32.850389 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.850373 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:16:32.850650 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.850622 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"7d360d682fed224adf22d7e8f9ab824194be251c2a8faf96e633da21da831119"}
Apr 17 11:16:32.850867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.850852 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:32.850867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.850867 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:32.851099 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.851082 2571 scope.go:117] "RemoveContainer" containerID="9c2b3f7ec7f1100b85def5dbed45d8320df1dbe9560ee5984bd91ae790107161"
Apr 17 11:16:32.852676 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.852655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:32.852775 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:32.852762 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:32.852827 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:32.852804 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret podName:75fcf97e-549f-48be-9a3a-142bc6a20eaa nodeName:}" failed. No retries permitted until 2026-04-17 11:16:48.85279271 +0000 UTC m=+41.769317614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret") pod "global-pull-secret-syncer-n25xj" (UID: "75fcf97e-549f-48be-9a3a-142bc6a20eaa") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:32.865373 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:32.865355 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:33.681779 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.681560 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:33.681913 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:33.681818 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:33.850797 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.850764 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-n25xj"]
Apr 17 11:16:33.851299 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.850919 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj"
Apr 17 11:16:33.851299 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:33.851040 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa"
Apr 17 11:16:33.854350 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.854326 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w7npk"]
Apr 17 11:16:33.854481 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.854385 2571 generic.go:358] "Generic (PLEG): container finished" podID="a61e6396-9d01-4767-84e7-6240ed2764cc" containerID="23c19492f3036b600bb0d1320664c9ecb6fa741b21a6a838bacb27d3ea95927e" exitCode=0
Apr 17 11:16:33.854481 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.854430 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk"
Apr 17 11:16:33.854590 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.854485 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerDied","Data":"23c19492f3036b600bb0d1320664c9ecb6fa741b21a6a838bacb27d3ea95927e"}
Apr 17 11:16:33.854590 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:33.854516 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca"
Apr 17 11:16:33.855163 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.855140 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6zdmq"]
Apr 17 11:16:33.858475 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.858458 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:16:33.858845 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.858820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" event={"ID":"1fa09637-267c-4a4b-8aac-54287c81cc4e","Type":"ContainerStarted","Data":"737f007aa7f3bbc6dd58a3fa29f27cf370240bae91e439cd151a3bcff7249215"}
Apr 17 11:16:33.858916 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.858850 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq"
Apr 17 11:16:33.858965 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:33.858931 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b"
Apr 17 11:16:33.859266 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.859245 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:33.874975 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.874953 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg"
Apr 17 11:16:33.918446 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:33.918409 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" podStartSLOduration=10.217442162 podStartE2EDuration="26.918396909s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.154690411 +0000 UTC m=+3.071215321" lastFinishedPulling="2026-04-17 11:16:26.855645149 +0000 UTC m=+19.772170068" observedRunningTime="2026-04-17 11:16:33.918054089 +0000 UTC m=+26.834579025" watchObservedRunningTime="2026-04-17 11:16:33.918396909 +0000 UTC m=+26.834921834"
Apr 17 11:16:34.862696 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:34.862666 2571 generic.go:358] "Generic (PLEG): container finished" podID="a61e6396-9d01-4767-84e7-6240ed2764cc" containerID="4ae39fb3d101cb7636c5a3d431e2a1b1d45dd34a673402ca0d8293ee12e4e438" exitCode=0
Apr 17 11:16:34.863133 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:34.862750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerDied","Data":"4ae39fb3d101cb7636c5a3d431e2a1b1d45dd34a673402ca0d8293ee12e4e438"}
Apr 17 11:16:35.682045 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:35.682013 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:35.682257 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:35.682054 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:35.682257 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:35.682013 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:35.682257 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:35.682156 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:35.682257 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:35.682225 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:35.682455 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:35.682313 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa" Apr 17 11:16:37.683028 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:37.682870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:37.683028 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:37.682903 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:37.683028 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:37.682974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:37.683803 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:37.683043 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa" Apr 17 11:16:37.683803 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:37.683035 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:37.683803 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:37.683163 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:38.666296 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:38.666219 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5pxkv" Apr 17 11:16:38.666445 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:38.666357 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:38.666889 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:38.666846 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5pxkv" Apr 17 11:16:39.682457 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:39.682427 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:39.682874 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:39.682432 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:39.682874 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:39.682569 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:39.682874 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:39.682529 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:16:39.682874 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:39.682646 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-w7npk" podUID="c41a7def-7809-48c2-80fa-7078299705ca" Apr 17 11:16:39.682874 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:39.682763 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n25xj" podUID="75fcf97e-549f-48be-9a3a-142bc6a20eaa" Apr 17 11:16:40.876975 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.876942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerStarted","Data":"3fa623236ff6969e382cdee9d455bd96ddf9c1b0ee5b5aad1d013283b53eae50"} Apr 17 11:16:40.889629 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.889608 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-16.ec2.internal" event="NodeReady" Apr 17 11:16:40.889772 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.889761 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:40.932538 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.932510 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7zgfs"] Apr 17 11:16:40.948262 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.948186 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kz66m"] Apr 17 11:16:40.948423 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.948399 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:40.954247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.951328 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:40.954247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.951347 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:40.954426 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.954266 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvwdc\"" Apr 17 11:16:40.964273 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.964256 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7zgfs"] Apr 17 11:16:40.964273 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.964276 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kz66m"] Apr 17 11:16:40.964389 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.964350 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:40.967034 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.967018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:40.967164 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.967044 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-whkkk\"" Apr 17 11:16:40.967164 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.967096 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:40.967269 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:40.967224 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:41.120563 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.120532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:41.120563 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.120565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tlsq\" (UniqueName: \"kubernetes.io/projected/f0b90299-ec82-4706-837f-42097067ec57-kube-api-access-6tlsq\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:41.120764 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.120594 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/87464ef4-c119-49ba-bee1-c792066f9cd0-tmp-dir\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.120764 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.120676 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87464ef4-c119-49ba-bee1-c792066f9cd0-config-volume\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.120764 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.120713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.120764 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.120734 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztv96\" (UniqueName: \"kubernetes.io/projected/87464ef4-c119-49ba-bee1-c792066f9cd0-kube-api-access-ztv96\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.221941 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.221853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87464ef4-c119-49ba-bee1-c792066f9cd0-config-volume\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.221941 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.221903 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.221941 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.221923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztv96\" (UniqueName: \"kubernetes.io/projected/87464ef4-c119-49ba-bee1-c792066f9cd0-kube-api-access-ztv96\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.222253 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.222020 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:41.222253 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.222034 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:41.222253 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.222065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tlsq\" (UniqueName: \"kubernetes.io/projected/f0b90299-ec82-4706-837f-42097067ec57-kube-api-access-6tlsq\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:41.222253 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.222090 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.722070185 +0000 UTC m=+34.638595103 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:16:41.222253 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.222175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87464ef4-c119-49ba-bee1-c792066f9cd0-tmp-dir\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.222253 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.222234 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:41.222456 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.222287 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:41.722271369 +0000 UTC m=+34.638796277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:16:41.222456 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.222393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87464ef4-c119-49ba-bee1-c792066f9cd0-config-volume\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.222456 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.222449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/87464ef4-c119-49ba-bee1-c792066f9cd0-tmp-dir\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.232721 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.232692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztv96\" (UniqueName: \"kubernetes.io/projected/87464ef4-c119-49ba-bee1-c792066f9cd0-kube-api-access-ztv96\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.232839 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.232756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tlsq\" (UniqueName: \"kubernetes.io/projected/f0b90299-ec82-4706-837f-42097067ec57-kube-api-access-6tlsq\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:41.323162 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.323105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:41.323331 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.323256 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:41.323331 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.323317 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:17:13.323303534 +0000 UTC m=+66.239828438 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:41.424312 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.424285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:41.424451 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.424412 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:41.424451 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.424427 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:41.424451 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.424435 2571 projected.go:194] Error preparing data for projected volume kube-api-access-l8wvn for pod openshift-network-diagnostics/network-check-target-w7npk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:41.424556 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.424475 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn podName:c41a7def-7809-48c2-80fa-7078299705ca nodeName:}" failed. No retries permitted until 2026-04-17 11:17:13.424463446 +0000 UTC m=+66.340988351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-l8wvn" (UniqueName: "kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn") pod "network-check-target-w7npk" (UID: "c41a7def-7809-48c2-80fa-7078299705ca") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:41.682365 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.682296 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:16:41.682494 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.682300 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:16:41.682589 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.682300 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:41.685231 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.685213 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:41.685409 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.685397 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dlffp\"" Apr 17 11:16:41.686549 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.686533 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:41.686613 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.686576 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q4j6t\"" Apr 17 11:16:41.686650 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.686632 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:16:41.686695 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.686651 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:41.726355 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.726328 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:41.726467 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.726401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:41.726509 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.726466 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:41.726542 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.726523 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.726508854 +0000 UTC m=+35.643033763 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:16:41.726581 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.726472 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:41.726611 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:41.726596 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:42.726583694 +0000 UTC m=+35.643108598 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:16:41.881194 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.881157 2571 generic.go:358] "Generic (PLEG): container finished" podID="a61e6396-9d01-4767-84e7-6240ed2764cc" containerID="3fa623236ff6969e382cdee9d455bd96ddf9c1b0ee5b5aad1d013283b53eae50" exitCode=0 Apr 17 11:16:41.881571 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:41.881200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerDied","Data":"3fa623236ff6969e382cdee9d455bd96ddf9c1b0ee5b5aad1d013283b53eae50"} Apr 17 11:16:42.733550 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:42.733353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:42.733767 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:42.733502 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:42.733767 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:42.733619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:42.733767 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:42.733674 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.733656057 +0000 UTC m=+37.650180979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:16:42.733767 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:42.733711 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:42.733767 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:42.733751 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.733740092 +0000 UTC m=+37.650264998 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:16:42.886009 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:42.885977 2571 generic.go:358] "Generic (PLEG): container finished" podID="a61e6396-9d01-4767-84e7-6240ed2764cc" containerID="276e920559791850aefb9f6b5a7874224deebfdc8ce7c9a3e1fe8eed24a4381a" exitCode=0 Apr 17 11:16:42.886447 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:42.886038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerDied","Data":"276e920559791850aefb9f6b5a7874224deebfdc8ce7c9a3e1fe8eed24a4381a"} Apr 17 11:16:43.890643 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:43.890614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d76ls" event={"ID":"a61e6396-9d01-4767-84e7-6240ed2764cc","Type":"ContainerStarted","Data":"829242b0bb10d9ef3bb9735b422d0bc788c7d744d91692f8401c3018e8bd2619"} Apr 17 11:16:43.916064 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:43.916024 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d76ls" podStartSLOduration=6.456799912 podStartE2EDuration="36.916011124s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:16:10.145540267 +0000 UTC m=+3.062065171" lastFinishedPulling="2026-04-17 11:16:40.604751479 +0000 UTC m=+33.521276383" observedRunningTime="2026-04-17 11:16:43.914362552 +0000 UTC m=+36.830887478" watchObservedRunningTime="2026-04-17 11:16:43.916011124 +0000 UTC m=+36.832536028" Apr 17 11:16:44.748524 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:44.748488 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:44.748524 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:44.748527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:44.748722 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:44.748618 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:44.748722 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:44.748627 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:44.748722 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:44.748670 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:48.74865477 +0000 UTC m=+41.665179676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:16:44.748722 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:44.748683 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:48.748677755 +0000 UTC m=+41.665202659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:16:48.776836 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:48.776798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:48.776836 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:48.776843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:48.777282 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:48.776930 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:48.777282 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:48.776934 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:48.777282 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:48.776988 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:56.776971349 +0000 UTC m=+49.693496253 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:16:48.777282 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:48.777006 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:56.776998294 +0000 UTC m=+49.693523198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:16:48.877672 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:48.877646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:48.881027 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:48.881008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75fcf97e-549f-48be-9a3a-142bc6a20eaa-original-pull-secret\") pod \"global-pull-secret-syncer-n25xj\" (UID: \"75fcf97e-549f-48be-9a3a-142bc6a20eaa\") " pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:48.901292 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:48.901269 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-n25xj" Apr 17 11:16:49.019437 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:49.019411 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-n25xj"] Apr 17 11:16:49.022592 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:49.022561 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75fcf97e_549f_48be_9a3a_142bc6a20eaa.slice/crio-4462aba6a071cfb50075fdfa593cb7b9cbfc2cadc9795c70ed68041887f035db WatchSource:0}: Error finding container 4462aba6a071cfb50075fdfa593cb7b9cbfc2cadc9795c70ed68041887f035db: Status 404 returned error can't find the container with id 4462aba6a071cfb50075fdfa593cb7b9cbfc2cadc9795c70ed68041887f035db Apr 17 11:16:49.903931 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:49.903892 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-n25xj" event={"ID":"75fcf97e-549f-48be-9a3a-142bc6a20eaa","Type":"ContainerStarted","Data":"4462aba6a071cfb50075fdfa593cb7b9cbfc2cadc9795c70ed68041887f035db"} Apr 17 11:16:52.727233 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.727209 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z"] Apr 17 11:16:52.745283 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.745264 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:52.749494 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.749468 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 11:16:52.749614 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.749524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 11:16:52.749614 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.749544 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9vbk8\"" Apr 17 11:16:52.749614 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.749553 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 11:16:52.749614 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.749597 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 11:16:52.752260 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.752241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z"] Apr 17 11:16:52.762462 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.762445 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4"] Apr 17 11:16:52.772495 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.772481 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.776061 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.776046 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 11:16:52.776173 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.776099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 11:16:52.776406 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.776391 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 11:16:52.776622 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.776607 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 11:16:52.781611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.781590 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4"] Apr 17 11:16:52.905422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26524d89-e7bf-45b4-b67a-22d408455003-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76777d7659-qkr7z\" (UID: \"26524d89-e7bf-45b4-b67a-22d408455003\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:52.905560 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.905560 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905504 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rqd\" (UniqueName: \"kubernetes.io/projected/151d1dc0-4349-4e64-bd4a-f976c47520fb-kube-api-access-x4rqd\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.905560 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905532 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-hub\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.905724 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.905724 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-ca\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.905724 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905668 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/151d1dc0-4349-4e64-bd4a-f976c47520fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:52.905724 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.905689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rvq\" (UniqueName: \"kubernetes.io/projected/26524d89-e7bf-45b4-b67a-22d408455003-kube-api-access-78rvq\") pod \"managed-serviceaccount-addon-agent-76777d7659-qkr7z\" (UID: \"26524d89-e7bf-45b4-b67a-22d408455003\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:52.910929 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.910868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-n25xj" event={"ID":"75fcf97e-549f-48be-9a3a-142bc6a20eaa","Type":"ContainerStarted","Data":"2ca2dd41a0124fd67f7cff4d80fd8458662d2d3d78b429330b2ee2713e8897e1"} Apr 17 11:16:52.927524 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:52.927485 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-n25xj" podStartSLOduration=32.294023234 podStartE2EDuration="35.927473691s" podCreationTimestamp="2026-04-17 11:16:17 +0000 UTC" firstStartedPulling="2026-04-17 11:16:49.024310442 +0000 UTC m=+41.940835345" 
lastFinishedPulling="2026-04-17 11:16:52.657760885 +0000 UTC m=+45.574285802" observedRunningTime="2026-04-17 11:16:52.926432888 +0000 UTC m=+45.842957813" watchObservedRunningTime="2026-04-17 11:16:52.927473691 +0000 UTC m=+45.843998613" Apr 17 11:16:53.006974 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.006949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26524d89-e7bf-45b4-b67a-22d408455003-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76777d7659-qkr7z\" (UID: \"26524d89-e7bf-45b4-b67a-22d408455003\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:53.007069 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007009 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.007069 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rqd\" (UniqueName: \"kubernetes.io/projected/151d1dc0-4349-4e64-bd4a-f976c47520fb-kube-api-access-x4rqd\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.007069 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-hub\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: 
\"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.007226 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.007347 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-ca\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.007465 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/151d1dc0-4349-4e64-bd4a-f976c47520fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.007465 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.007397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78rvq\" (UniqueName: \"kubernetes.io/projected/26524d89-e7bf-45b4-b67a-22d408455003-kube-api-access-78rvq\") pod \"managed-serviceaccount-addon-agent-76777d7659-qkr7z\" (UID: \"26524d89-e7bf-45b4-b67a-22d408455003\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:53.008376 
ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.008351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/151d1dc0-4349-4e64-bd4a-f976c47520fb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.009413 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.009368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.009639 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.009615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/26524d89-e7bf-45b4-b67a-22d408455003-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-76777d7659-qkr7z\" (UID: \"26524d89-e7bf-45b4-b67a-22d408455003\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:53.009717 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.009682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-hub\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.009774 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.009731 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-ca\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.009851 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.009834 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/151d1dc0-4349-4e64-bd4a-f976c47520fb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.016877 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.016858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rqd\" (UniqueName: \"kubernetes.io/projected/151d1dc0-4349-4e64-bd4a-f976c47520fb-kube-api-access-x4rqd\") pod \"cluster-proxy-proxy-agent-7459b48476-5jzl4\" (UID: \"151d1dc0-4349-4e64-bd4a-f976c47520fb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.023184 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.023165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rvq\" (UniqueName: \"kubernetes.io/projected/26524d89-e7bf-45b4-b67a-22d408455003-kube-api-access-78rvq\") pod \"managed-serviceaccount-addon-agent-76777d7659-qkr7z\" (UID: \"26524d89-e7bf-45b4-b67a-22d408455003\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:53.080936 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.080914 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" Apr 17 11:16:53.102818 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.102800 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:16:53.222196 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.221822 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z"] Apr 17 11:16:53.224661 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:53.224631 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26524d89_e7bf_45b4_b67a_22d408455003.slice/crio-dd35f0dad3c510c23430268770eb52eb553d6c0b6c419dfa805d6357074b42d3 WatchSource:0}: Error finding container dd35f0dad3c510c23430268770eb52eb553d6c0b6c419dfa805d6357074b42d3: Status 404 returned error can't find the container with id dd35f0dad3c510c23430268770eb52eb553d6c0b6c419dfa805d6357074b42d3 Apr 17 11:16:53.237710 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.237685 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4"] Apr 17 11:16:53.240042 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:16:53.240022 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod151d1dc0_4349_4e64_bd4a_f976c47520fb.slice/crio-c2f2dfa6433ecea34bd4f7113e41b0ab586554de4a9c21869fdde1080ccde115 WatchSource:0}: Error finding container c2f2dfa6433ecea34bd4f7113e41b0ab586554de4a9c21869fdde1080ccde115: Status 404 returned error can't find the container with id c2f2dfa6433ecea34bd4f7113e41b0ab586554de4a9c21869fdde1080ccde115 Apr 17 11:16:53.914502 ip-10-0-141-16 kubenswrapper[2571]: I0417 
11:16:53.914463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" event={"ID":"151d1dc0-4349-4e64-bd4a-f976c47520fb","Type":"ContainerStarted","Data":"c2f2dfa6433ecea34bd4f7113e41b0ab586554de4a9c21869fdde1080ccde115"} Apr 17 11:16:53.915667 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:53.915617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" event={"ID":"26524d89-e7bf-45b4-b67a-22d408455003","Type":"ContainerStarted","Data":"dd35f0dad3c510c23430268770eb52eb553d6c0b6c419dfa805d6357074b42d3"} Apr 17 11:16:56.835397 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:56.835364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:16:56.835397 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:56.835404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:16:56.835815 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:56.835496 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:56.835815 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:56.835499 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:56.835815 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:56.835545 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:12.835532524 +0000 UTC m=+65.752057428 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:16:56.835815 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:16:56.835557 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:12.835551838 +0000 UTC m=+65.752076742 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:16:57.924537 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:57.924500 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" event={"ID":"151d1dc0-4349-4e64-bd4a-f976c47520fb","Type":"ContainerStarted","Data":"86776d2cb75004eed98e731bca5e1e1a94f7cb43ececa6ea77f090727aae74bb"} Apr 17 11:16:57.925664 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:57.925638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" event={"ID":"26524d89-e7bf-45b4-b67a-22d408455003","Type":"ContainerStarted","Data":"b1a72fba063649664ccb043dd567a9a90f3ac682b6528900efa05d9224df37e4"} Apr 17 11:16:57.941420 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:16:57.941383 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" podStartSLOduration=1.997735371 podStartE2EDuration="5.941350937s" podCreationTimestamp="2026-04-17 11:16:52 +0000 UTC" firstStartedPulling="2026-04-17 11:16:53.226429162 +0000 UTC m=+46.142954066" lastFinishedPulling="2026-04-17 11:16:57.170044714 +0000 UTC m=+50.086569632" observedRunningTime="2026-04-17 11:16:57.941187268 +0000 UTC m=+50.857712221" watchObservedRunningTime="2026-04-17 11:16:57.941350937 +0000 UTC m=+50.857875863" Apr 17 11:17:00.932686 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:00.932648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" event={"ID":"151d1dc0-4349-4e64-bd4a-f976c47520fb","Type":"ContainerStarted","Data":"1e60854e31713e597321e9ccc7a5ca749c75ce388480b08ca5c6499e6850aaf2"} Apr 17 11:17:00.932686 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:00.932685 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" event={"ID":"151d1dc0-4349-4e64-bd4a-f976c47520fb","Type":"ContainerStarted","Data":"777923e259bb7268bbea75dfa491d6781c62bc622c8e27b28acf6699f630d779"} Apr 17 11:17:00.953583 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:00.953531 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" podStartSLOduration=2.263557874 podStartE2EDuration="8.953519377s" podCreationTimestamp="2026-04-17 11:16:52 +0000 UTC" firstStartedPulling="2026-04-17 11:16:53.241504114 +0000 UTC m=+46.158029019" lastFinishedPulling="2026-04-17 11:16:59.931465603 +0000 UTC m=+52.847990522" observedRunningTime="2026-04-17 11:17:00.95189515 +0000 UTC m=+53.868420088" watchObservedRunningTime="2026-04-17 11:17:00.953519377 +0000 
UTC m=+53.870044303" Apr 17 11:17:05.876154 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:05.876111 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmrcg" Apr 17 11:17:12.840994 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:12.840951 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:17:12.840994 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:12.840998 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:17:12.841528 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:12.841084 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:12.841528 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:12.841088 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:12.841528 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:12.841152 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:44.841138715 +0000 UTC m=+97.757663620 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:17:12.841528 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:12.841181 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:44.841166621 +0000 UTC m=+97.757691526 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:17:13.345949 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.345923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:17:13.348807 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.348788 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:17:13.357152 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:13.357135 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:17:13.357223 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:13.357201 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:17.357185322 +0000 UTC m=+130.273710226 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : secret "metrics-daemon-secret" not found Apr 17 11:17:13.447120 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.447090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:17:13.450314 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.450298 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:17:13.460821 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.460800 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:17:13.471514 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.471488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wvn\" (UniqueName: \"kubernetes.io/projected/c41a7def-7809-48c2-80fa-7078299705ca-kube-api-access-l8wvn\") pod \"network-check-target-w7npk\" (UID: \"c41a7def-7809-48c2-80fa-7078299705ca\") " pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:17:13.494919 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.494898 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dlffp\"" Apr 17 11:17:13.503281 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.503264 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:17:13.617188 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.617163 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-w7npk"] Apr 17 11:17:13.617445 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:17:13.617412 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41a7def_7809_48c2_80fa_7078299705ca.slice/crio-37e4dba89694b703926fbff9cee75559c70f8d2fcb9b4c397d488a075c124a17 WatchSource:0}: Error finding container 37e4dba89694b703926fbff9cee75559c70f8d2fcb9b4c397d488a075c124a17: Status 404 returned error can't find the container with id 37e4dba89694b703926fbff9cee75559c70f8d2fcb9b4c397d488a075c124a17 Apr 17 11:17:13.960609 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:13.960535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w7npk" event={"ID":"c41a7def-7809-48c2-80fa-7078299705ca","Type":"ContainerStarted","Data":"37e4dba89694b703926fbff9cee75559c70f8d2fcb9b4c397d488a075c124a17"} Apr 17 11:17:16.969512 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:16.969477 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-w7npk" event={"ID":"c41a7def-7809-48c2-80fa-7078299705ca","Type":"ContainerStarted","Data":"2acd1ee022008c2b41236abfdd17a34c0459837ef02498ad43f863e734d4c65c"} Apr 17 11:17:16.969930 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:16.969595 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:17:16.987048 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:16.987004 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-w7npk" podStartSLOduration=67.384780458 podStartE2EDuration="1m9.986991544s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:17:13.619216986 +0000 UTC m=+66.535741898" lastFinishedPulling="2026-04-17 11:17:16.22142808 +0000 UTC m=+69.137952984" observedRunningTime="2026-04-17 11:17:16.98610517 +0000 UTC m=+69.902630095" watchObservedRunningTime="2026-04-17 11:17:16.986991544 +0000 UTC m=+69.903516469" Apr 17 11:17:44.870635 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:44.870586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:17:44.870635 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:44.870640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:17:44.871079 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:44.870730 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:17:44.871079 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:44.870733 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:17:44.871079 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:44.870791 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert podName:f0b90299-ec82-4706-837f-42097067ec57 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:18:48.870777282 +0000 UTC m=+161.787302186 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert") pod "ingress-canary-kz66m" (UID: "f0b90299-ec82-4706-837f-42097067ec57") : secret "canary-serving-cert" not found Apr 17 11:17:44.871079 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:17:44.870803 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls podName:87464ef4-c119-49ba-bee1-c792066f9cd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:48.870797783 +0000 UTC m=+161.787322687 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls") pod "dns-default-7zgfs" (UID: "87464ef4-c119-49ba-bee1-c792066f9cd0") : secret "dns-default-metrics-tls" not found Apr 17 11:17:47.973772 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:17:47.973747 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-w7npk" Apr 17 11:18:17.401961 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:17.401903 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:18:17.402456 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:18:17.402071 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:18:17.402456 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:18:17.402160 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs podName:39a4801b-dc46-42c1-a6fe-8a2d79362e6b nodeName:}" failed. No retries permitted until 2026-04-17 11:20:19.402140925 +0000 UTC m=+252.318665843 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs") pod "network-metrics-daemon-6zdmq" (UID: "39a4801b-dc46-42c1-a6fe-8a2d79362e6b") : secret "metrics-daemon-secret" not found Apr 17 11:18:35.587714 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:35.587686 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k8djp_88f53a5c-9a8b-457b-9e6e-e62bf112bbb8/dns-node-resolver/0.log" Apr 17 11:18:36.787815 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:36.787788 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lqhrs_06811185-7a8c-419b-9f44-d67b67d794d3/node-ca/0.log" Apr 17 11:18:43.960549 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:18:43.960505 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7zgfs" podUID="87464ef4-c119-49ba-bee1-c792066f9cd0" Apr 17 11:18:43.972425 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:18:43.972393 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kz66m" podUID="f0b90299-ec82-4706-837f-42097067ec57" Apr 17 11:18:44.164698 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:44.164665 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7zgfs" Apr 17 11:18:44.697962 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:18:44.697914 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-6zdmq" podUID="39a4801b-dc46-42c1-a6fe-8a2d79362e6b" Apr 17 11:18:48.919572 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:48.919532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:18:48.919572 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:48.919580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:18:48.921856 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:48.921829 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87464ef4-c119-49ba-bee1-c792066f9cd0-metrics-tls\") pod \"dns-default-7zgfs\" (UID: \"87464ef4-c119-49ba-bee1-c792066f9cd0\") " pod="openshift-dns/dns-default-7zgfs" Apr 17 11:18:48.921983 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:48.921916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f0b90299-ec82-4706-837f-42097067ec57-cert\") pod \"ingress-canary-kz66m\" (UID: \"f0b90299-ec82-4706-837f-42097067ec57\") " pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:18:48.968562 ip-10-0-141-16 kubenswrapper[2571]: I0417 
11:18:48.968536 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvwdc\"" Apr 17 11:18:48.976919 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:48.976894 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7zgfs" Apr 17 11:18:49.092175 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:49.092141 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7zgfs"] Apr 17 11:18:49.095329 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:18:49.095284 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87464ef4_c119_49ba_bee1_c792066f9cd0.slice/crio-4ab9ea086e148a1f8b24594fd69ce68ad43fe774d7a822209e553cd343e24bf6 WatchSource:0}: Error finding container 4ab9ea086e148a1f8b24594fd69ce68ad43fe774d7a822209e553cd343e24bf6: Status 404 returned error can't find the container with id 4ab9ea086e148a1f8b24594fd69ce68ad43fe774d7a822209e553cd343e24bf6 Apr 17 11:18:49.180818 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:49.180727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7zgfs" event={"ID":"87464ef4-c119-49ba-bee1-c792066f9cd0","Type":"ContainerStarted","Data":"4ab9ea086e148a1f8b24594fd69ce68ad43fe774d7a822209e553cd343e24bf6"} Apr 17 11:18:51.186997 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:51.186955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7zgfs" event={"ID":"87464ef4-c119-49ba-bee1-c792066f9cd0","Type":"ContainerStarted","Data":"a58b1667cd1d641b723c4ea59abff24b2b98b7a906e244ec6418a6e8345f97d3"} Apr 17 11:18:51.186997 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:51.186991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7zgfs" 
event={"ID":"87464ef4-c119-49ba-bee1-c792066f9cd0","Type":"ContainerStarted","Data":"215bf1c9c9ac436339a53884c71e54bb596f15256660a9fcd58a48bc6099a16f"} Apr 17 11:18:51.187442 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:51.187085 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7zgfs" Apr 17 11:18:51.205213 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:51.205164 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7zgfs" podStartSLOduration=129.932626499 podStartE2EDuration="2m11.205147979s" podCreationTimestamp="2026-04-17 11:16:40 +0000 UTC" firstStartedPulling="2026-04-17 11:18:49.096969871 +0000 UTC m=+162.013494775" lastFinishedPulling="2026-04-17 11:18:50.369491329 +0000 UTC m=+163.286016255" observedRunningTime="2026-04-17 11:18:51.203863751 +0000 UTC m=+164.120388673" watchObservedRunningTime="2026-04-17 11:18:51.205147979 +0000 UTC m=+164.121672945" Apr 17 11:18:55.682389 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.682306 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:18:55.967031 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.966957 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fk6ct"] Apr 17 11:18:55.969977 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.969962 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:55.976498 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.976477 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 11:18:55.977045 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.977028 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 11:18:55.977457 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.977441 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rkd8r\"" Apr 17 11:18:55.979293 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.979279 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:18:55.980854 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.980841 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:18:55.989018 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:55.988999 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fk6ct"] Apr 17 11:18:56.051254 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.051222 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-gbwjs"] Apr 17 11:18:56.054210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.054192 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:18:56.057682 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.057661 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xcd29\"" Apr 17 11:18:56.057960 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.057945 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 11:18:56.058449 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.058436 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 11:18:56.069923 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.069894 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-gbwjs"] Apr 17 11:18:56.070590 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.070566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a6af175a-d4db-493f-a569-e21c7304b8de-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.070713 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.070625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a6af175a-d4db-493f-a569-e21c7304b8de-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.070713 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.070651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t5snn\" (UniqueName: \"kubernetes.io/projected/a6af175a-d4db-493f-a569-e21c7304b8de-kube-api-access-t5snn\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.070713 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.070671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drmc7\" (UniqueName: \"kubernetes.io/projected/c68c2a92-99c5-42b3-be90-a59b191fd1cd-kube-api-access-drmc7\") pod \"downloads-6bcc868b7-gbwjs\" (UID: \"c68c2a92-99c5-42b3-be90-a59b191fd1cd\") " pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:18:56.070819 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.070756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a6af175a-d4db-493f-a569-e21c7304b8de-crio-socket\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.070819 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.070789 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6af175a-d4db-493f-a569-e21c7304b8de-data-volume\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.171637 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a6af175a-d4db-493f-a569-e21c7304b8de-crio-socket\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " 
pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.171637 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6af175a-d4db-493f-a569-e21c7304b8de-data-volume\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.171867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a6af175a-d4db-493f-a569-e21c7304b8de-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.171867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a6af175a-d4db-493f-a569-e21c7304b8de-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.171867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5snn\" (UniqueName: \"kubernetes.io/projected/a6af175a-d4db-493f-a569-e21c7304b8de-kube-api-access-t5snn\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.171867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drmc7\" (UniqueName: 
\"kubernetes.io/projected/c68c2a92-99c5-42b3-be90-a59b191fd1cd-kube-api-access-drmc7\") pod \"downloads-6bcc868b7-gbwjs\" (UID: \"c68c2a92-99c5-42b3-be90-a59b191fd1cd\") " pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:18:56.171867 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.171726 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a6af175a-d4db-493f-a569-e21c7304b8de-crio-socket\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.172049 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.172033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a6af175a-d4db-493f-a569-e21c7304b8de-data-volume\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.172270 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.172245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a6af175a-d4db-493f-a569-e21c7304b8de-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.174018 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.173993 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a6af175a-d4db-493f-a569-e21c7304b8de-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.182185 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.182165 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5snn\" (UniqueName: \"kubernetes.io/projected/a6af175a-d4db-493f-a569-e21c7304b8de-kube-api-access-t5snn\") pod \"insights-runtime-extractor-fk6ct\" (UID: \"a6af175a-d4db-493f-a569-e21c7304b8de\") " pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.182396 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.182376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drmc7\" (UniqueName: \"kubernetes.io/projected/c68c2a92-99c5-42b3-be90-a59b191fd1cd-kube-api-access-drmc7\") pod \"downloads-6bcc868b7-gbwjs\" (UID: \"c68c2a92-99c5-42b3-be90-a59b191fd1cd\") " pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:18:56.278547 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.278450 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fk6ct" Apr 17 11:18:56.362427 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.362394 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:18:56.415224 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.414343 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fk6ct"] Apr 17 11:18:56.418952 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:18:56.418925 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6af175a_d4db_493f_a569_e21c7304b8de.slice/crio-949c52f64347fec2092f5006bf0f1dd95854989688859ab3e861c76378aacf88 WatchSource:0}: Error finding container 949c52f64347fec2092f5006bf0f1dd95854989688859ab3e861c76378aacf88: Status 404 returned error can't find the container with id 949c52f64347fec2092f5006bf0f1dd95854989688859ab3e861c76378aacf88 Apr 17 11:18:56.489593 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:56.489567 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-gbwjs"] Apr 17 11:18:56.492801 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:18:56.492776 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68c2a92_99c5_42b3_be90_a59b191fd1cd.slice/crio-3de8a03158dfc895a48df31ee1dc8b5189bf57d300f75d278af6a8eae2863ad5 WatchSource:0}: Error finding container 3de8a03158dfc895a48df31ee1dc8b5189bf57d300f75d278af6a8eae2863ad5: Status 404 returned error can't find the container with id 3de8a03158dfc895a48df31ee1dc8b5189bf57d300f75d278af6a8eae2863ad5 Apr 17 11:18:57.204150 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.204092 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-gbwjs" event={"ID":"c68c2a92-99c5-42b3-be90-a59b191fd1cd","Type":"ContainerStarted","Data":"3de8a03158dfc895a48df31ee1dc8b5189bf57d300f75d278af6a8eae2863ad5"} Apr 17 11:18:57.205913 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.205876 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fk6ct" event={"ID":"a6af175a-d4db-493f-a569-e21c7304b8de","Type":"ContainerStarted","Data":"54b8049d6088755ff88520503cd0ff88f311b75b534ee6928197e87d0afb0746"} Apr 17 11:18:57.205913 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.205916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fk6ct" event={"ID":"a6af175a-d4db-493f-a569-e21c7304b8de","Type":"ContainerStarted","Data":"632ffedf2ed30edbe8d1c4dae1e7a207d1f79b4f4889cf660a32e46ee1319976"} Apr 17 11:18:57.206093 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.205929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fk6ct" event={"ID":"a6af175a-d4db-493f-a569-e21c7304b8de","Type":"ContainerStarted","Data":"949c52f64347fec2092f5006bf0f1dd95854989688859ab3e861c76378aacf88"} Apr 17 11:18:57.685494 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.685190 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:18:57.688794 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.688766 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-whkkk\"" Apr 17 11:18:57.696199 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.696175 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kz66m" Apr 17 11:18:57.843302 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:57.843248 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kz66m"] Apr 17 11:18:57.846843 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:18:57.846800 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0b90299_ec82_4706_837f_42097067ec57.slice/crio-7fe2ba2b499d6981c221e3b1ae80499c7c361485d85fba314da8bf6dbaf4d9d8 WatchSource:0}: Error finding container 7fe2ba2b499d6981c221e3b1ae80499c7c361485d85fba314da8bf6dbaf4d9d8: Status 404 returned error can't find the container with id 7fe2ba2b499d6981c221e3b1ae80499c7c361485d85fba314da8bf6dbaf4d9d8 Apr 17 11:18:58.210880 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:58.210830 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kz66m" event={"ID":"f0b90299-ec82-4706-837f-42097067ec57","Type":"ContainerStarted","Data":"7fe2ba2b499d6981c221e3b1ae80499c7c361485d85fba314da8bf6dbaf4d9d8"} Apr 17 11:18:58.212266 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:58.212234 2571 generic.go:358] "Generic (PLEG): container finished" podID="26524d89-e7bf-45b4-b67a-22d408455003" containerID="b1a72fba063649664ccb043dd567a9a90f3ac682b6528900efa05d9224df37e4" exitCode=255 Apr 17 11:18:58.212414 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:58.212299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" event={"ID":"26524d89-e7bf-45b4-b67a-22d408455003","Type":"ContainerDied","Data":"b1a72fba063649664ccb043dd567a9a90f3ac682b6528900efa05d9224df37e4"} Apr 17 11:18:58.212671 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:58.212653 2571 scope.go:117] "RemoveContainer" 
containerID="b1a72fba063649664ccb043dd567a9a90f3ac682b6528900efa05d9224df37e4" Apr 17 11:18:59.217779 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:59.217529 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-76777d7659-qkr7z" event={"ID":"26524d89-e7bf-45b4-b67a-22d408455003","Type":"ContainerStarted","Data":"a3a306835ba8af0f368c2f61a4b30bb45b7c17f72d4dcf3ca564f206a436432d"} Apr 17 11:18:59.220266 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:59.220235 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fk6ct" event={"ID":"a6af175a-d4db-493f-a569-e21c7304b8de","Type":"ContainerStarted","Data":"9a9f4b7806e52f626cf64cb7aa2b1cd92f355fad56e952e6cc482badfc6fabed"} Apr 17 11:18:59.251074 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:18:59.251016 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fk6ct" podStartSLOduration=2.109196081 podStartE2EDuration="4.250995976s" podCreationTimestamp="2026-04-17 11:18:55 +0000 UTC" firstStartedPulling="2026-04-17 11:18:56.487503957 +0000 UTC m=+169.404028862" lastFinishedPulling="2026-04-17 11:18:58.629303847 +0000 UTC m=+171.545828757" observedRunningTime="2026-04-17 11:18:59.250245961 +0000 UTC m=+172.166770887" watchObservedRunningTime="2026-04-17 11:18:59.250995976 +0000 UTC m=+172.167520903" Apr 17 11:19:00.230348 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:00.230262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kz66m" event={"ID":"f0b90299-ec82-4706-837f-42097067ec57","Type":"ContainerStarted","Data":"cd17962571f047b968990bc01bbe392142961342bab60404014d22924fc71769"} Apr 17 11:19:00.247589 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:00.247538 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-kz66m" podStartSLOduration=138.123543287 podStartE2EDuration="2m20.24751893s" podCreationTimestamp="2026-04-17 11:16:40 +0000 UTC" firstStartedPulling="2026-04-17 11:18:57.849342675 +0000 UTC m=+170.765867581" lastFinishedPulling="2026-04-17 11:18:59.973318316 +0000 UTC m=+172.889843224" observedRunningTime="2026-04-17 11:19:00.246366611 +0000 UTC m=+173.162891535" watchObservedRunningTime="2026-04-17 11:19:00.24751893 +0000 UTC m=+173.164043858" Apr 17 11:19:01.192795 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:01.192766 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7zgfs" Apr 17 11:19:11.905031 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.904941 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7"] Apr 17 11:19:11.909550 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.909524 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:11.915436 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.915405 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:19:11.915603 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.915480 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 11:19:11.915838 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.915815 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-66tg5\"" Apr 17 11:19:11.916681 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.916658 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 11:19:11.916790 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.916687 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:19:11.916790 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.916690 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:19:11.922169 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.922146 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7"] Apr 17 11:19:11.940710 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.940682 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cmgrm"] Apr 17 11:19:11.944103 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.944077 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:11.959299 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.959264 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 11:19:11.959446 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.959374 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 11:19:11.961586 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.961564 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dblbs\"" Apr 17 11:19:11.962184 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:11.962167 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 11:19:12.001347 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-metrics-client-ca\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001553 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.001553 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001415 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-textfile\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001553 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001438 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-wtmp\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001553 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001506 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-root\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001553 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc99e5e-620d-41ad-80d7-fc34f0995756-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwdgp\" (UniqueName: \"kubernetes.io/projected/6fc99e5e-620d-41ad-80d7-fc34f0995756-kube-api-access-qwdgp\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-tls\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 
11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqhcm\" (UniqueName: \"kubernetes.io/projected/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-kube-api-access-mqhcm\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.001771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.001752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-sys\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103157 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-metrics-client-ca\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103347 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.103347 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103211 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-textfile\") pod 
\"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103347 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-wtmp\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103500 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:19:12.103356 2571 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 17 11:19:12.103500 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103387 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-wtmp\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103500 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-root\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103500 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:19:12.103439 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-tls podName:6fc99e5e-620d-41ad-80d7-fc34f0995756 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:12.603416705 +0000 UTC m=+185.519941629 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-k8fn7" (UID: "6fc99e5e-620d-41ad-80d7-fc34f0995756") : secret "openshift-state-metrics-tls" not found Apr 17 11:19:12.103500 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103469 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-root\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103500 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc99e5e-620d-41ad-80d7-fc34f0995756-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: 
\"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwdgp\" (UniqueName: \"kubernetes.io/projected/6fc99e5e-620d-41ad-80d7-fc34f0995756-kube-api-access-qwdgp\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-tls\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqhcm\" (UniqueName: \"kubernetes.io/projected/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-kube-api-access-mqhcm\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-sys\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-metrics-client-ca\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.103796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-textfile\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.104230 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:19:12.103865 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:19:12.104230 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:19:12.103914 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-tls podName:338b62b9-f284-4a24-8fa8-ae9c7f82ce56 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:12.603898132 +0000 UTC m=+185.520423044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-tls") pod "node-exporter-cmgrm" (UID: "338b62b9-f284-4a24-8fa8-ae9c7f82ce56") : secret "node-exporter-tls" not found Apr 17 11:19:12.104230 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.103976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-sys\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.104230 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.104021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.104745 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.104721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fc99e5e-620d-41ad-80d7-fc34f0995756-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.106623 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.106596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.106735 ip-10-0-141-16 
kubenswrapper[2571]: I0417 11:19:12.106651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.114301 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.114275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwdgp\" (UniqueName: \"kubernetes.io/projected/6fc99e5e-620d-41ad-80d7-fc34f0995756-kube-api-access-qwdgp\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.115249 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.115224 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqhcm\" (UniqueName: \"kubernetes.io/projected/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-kube-api-access-mqhcm\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.264383 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.264281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-gbwjs" event={"ID":"c68c2a92-99c5-42b3-be90-a59b191fd1cd","Type":"ContainerStarted","Data":"5e16cff6a713cf7648a1279b8b3697554dfccb81551601cc46e2a7294f3a43fb"} Apr 17 11:19:12.264548 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.264518 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:19:12.275853 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.275828 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/downloads-6bcc868b7-gbwjs" Apr 17 11:19:12.301592 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.301522 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-gbwjs" podStartSLOduration=1.206266891 podStartE2EDuration="16.301498598s" podCreationTimestamp="2026-04-17 11:18:56 +0000 UTC" firstStartedPulling="2026-04-17 11:18:56.494692015 +0000 UTC m=+169.411216920" lastFinishedPulling="2026-04-17 11:19:11.589923719 +0000 UTC m=+184.506448627" observedRunningTime="2026-04-17 11:19:12.299694129 +0000 UTC m=+185.216219055" watchObservedRunningTime="2026-04-17 11:19:12.301498598 +0000 UTC m=+185.218023526" Apr 17 11:19:12.608388 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.608348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-tls\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.608593 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.608422 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" Apr 17 11:19:12.611210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.611180 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/338b62b9-f284-4a24-8fa8-ae9c7f82ce56-node-exporter-tls\") pod \"node-exporter-cmgrm\" (UID: \"338b62b9-f284-4a24-8fa8-ae9c7f82ce56\") " pod="openshift-monitoring/node-exporter-cmgrm" Apr 17 11:19:12.611336 
ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.611264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6fc99e5e-620d-41ad-80d7-fc34f0995756-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k8fn7\" (UID: \"6fc99e5e-620d-41ad-80d7-fc34f0995756\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7"
Apr 17 11:19:12.820631 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.820588 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7"
Apr 17 11:19:12.857970 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.857932 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cmgrm"
Apr 17 11:19:12.966823 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:12.966786 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7"]
Apr 17 11:19:12.970286 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:19:12.970252 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc99e5e_620d_41ad_80d7_fc34f0995756.slice/crio-27b102c062aa27a26708739438a3bd7533ebe8cc44c6497cae311baa20b74aa2 WatchSource:0}: Error finding container 27b102c062aa27a26708739438a3bd7533ebe8cc44c6497cae311baa20b74aa2: Status 404 returned error can't find the container with id 27b102c062aa27a26708739438a3bd7533ebe8cc44c6497cae311baa20b74aa2
Apr 17 11:19:13.034177 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.034146 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:19:13.038946 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.038924 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.042267 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.041758 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 11:19:13.042267 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.041835 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 11:19:13.042267 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.041958 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 11:19:13.042267 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.041967 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 11:19:13.042267 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.042014 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 11:19:13.042267 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.042100 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-24jqx\""
Apr 17 11:19:13.042666 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.042335 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 11:19:13.042666 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.042363 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 11:19:13.042666 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.042425 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 11:19:13.042666 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.042527 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 11:19:13.057211 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.057180 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:19:13.113231 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113231 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113288 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-config-volume\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113330 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghggs\"
(UniqueName: \"kubernetes.io/projected/221d8e81-bcce-4bbe-9c4f-974f2cce276f-kube-api-access-ghggs\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113422 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113360 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113529 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113436 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113529 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/221d8e81-bcce-4bbe-9c4f-974f2cce276f-config-out\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113611 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113699 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-web-config\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113738 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113704 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/221d8e81-bcce-4bbe-9c4f-974f2cce276f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113785 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.113785 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.113772 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214309 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-config-volume\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214506 ip-10-0-141-16 kubenswrapper[2571]: I0417
11:19:13.214421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghggs\" (UniqueName: \"kubernetes.io/projected/221d8e81-bcce-4bbe-9c4f-974f2cce276f-kube-api-access-ghggs\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214506 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214810 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/221d8e81-bcce-4bbe-9c4f-974f2cce276f-config-out\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214810 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214810 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214810 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-web-config\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214810 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/221d8e81-bcce-4bbe-9c4f-974f2cce276f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.214810 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.215074 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.214807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.215074 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:19:13.214951 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-trusted-ca-bundle podName:221d8e81-bcce-4bbe-9c4f-974f2cce276f nodeName:}" failed. No retries permitted until 2026-04-17 11:19:13.714926692 +0000 UTC m=+186.631451601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "221d8e81-bcce-4bbe-9c4f-974f2cce276f") : configmap references non-existent config key: ca-bundle.crt
Apr 17 11:19:13.215582 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.215546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.217621 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.217504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.217743 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.217687 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-config-volume\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") "
pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.217899 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.217872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/221d8e81-bcce-4bbe-9c4f-974f2cce276f-config-out\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.218180 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.218141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/221d8e81-bcce-4bbe-9c4f-974f2cce276f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.218255 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.218179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.218527 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.218503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-web-config\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.218630 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.218572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.218630 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.218589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.219321 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.219303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/221d8e81-bcce-4bbe-9c4f-974f2cce276f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.235370 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.235343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghggs\" (UniqueName: \"kubernetes.io/projected/221d8e81-bcce-4bbe-9c4f-974f2cce276f-kube-api-access-ghggs\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.269418 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.269336 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmgrm" event={"ID":"338b62b9-f284-4a24-8fa8-ae9c7f82ce56","Type":"ContainerStarted","Data":"80a124c6ce616b2c716719967117cebd641c57f80c41baac9e41e3aefd2bc8d4"}
Apr 17 11:19:13.271819 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.271674 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" event={"ID":"6fc99e5e-620d-41ad-80d7-fc34f0995756","Type":"ContainerStarted","Data":"2b4f08a6a5c5d8dd69446406e2edc274ecb0145aaea096efe3e837a302e2d1d4"}
Apr 17 11:19:13.272045 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.272020 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" event={"ID":"6fc99e5e-620d-41ad-80d7-fc34f0995756","Type":"ContainerStarted","Data":"ba143585ebeff468baf2b294652a9edd11dc74f0467ee5f92333a1a19979387b"}
Apr 17 11:19:13.272144 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.272049 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" event={"ID":"6fc99e5e-620d-41ad-80d7-fc34f0995756","Type":"ContainerStarted","Data":"27b102c062aa27a26708739438a3bd7533ebe8cc44c6497cae311baa20b74aa2"}
Apr 17 11:19:13.719388 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.719353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.720395 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.720370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/221d8e81-bcce-4bbe-9c4f-974f2cce276f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"221d8e81-bcce-4bbe-9c4f-974f2cce276f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:13.950694 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:13.950656 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:14.122289 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:14.122258 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:19:14.123514 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:19:14.123484 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221d8e81_bcce_4bbe_9c4f_974f2cce276f.slice/crio-e525728fa2279d469c6366ef1fe4def8a7dd3a7aeb0afb10af00dcadfff7b323 WatchSource:0}: Error finding container e525728fa2279d469c6366ef1fe4def8a7dd3a7aeb0afb10af00dcadfff7b323: Status 404 returned error can't find the container with id e525728fa2279d469c6366ef1fe4def8a7dd3a7aeb0afb10af00dcadfff7b323
Apr 17 11:19:14.277071 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:14.276962 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"e525728fa2279d469c6366ef1fe4def8a7dd3a7aeb0afb10af00dcadfff7b323"}
Apr 17 11:19:14.278969 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:14.278921 2571 generic.go:358] "Generic (PLEG): container finished" podID="338b62b9-f284-4a24-8fa8-ae9c7f82ce56" containerID="89195aeec864b3d3c854ad2671a41b73cc7962e6f7ab9c3e7629f68c92ff345b" exitCode=0
Apr 17 11:19:14.279266 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:14.279054 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmgrm" event={"ID":"338b62b9-f284-4a24-8fa8-ae9c7f82ce56","Type":"ContainerDied","Data":"89195aeec864b3d3c854ad2671a41b73cc7962e6f7ab9c3e7629f68c92ff345b"}
Apr 17 11:19:15.284195 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:15.284155 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmgrm" event={"ID":"338b62b9-f284-4a24-8fa8-ae9c7f82ce56","Type":"ContainerStarted","Data":"38178e578fc01d687bee1a98ed5b6ff53d8123a1e93a23048d48221d8e121d05"}
Apr 17 11:19:15.284704 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:15.284201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmgrm" event={"ID":"338b62b9-f284-4a24-8fa8-ae9c7f82ce56","Type":"ContainerStarted","Data":"772bd9117f29e5400759e5fefe413a71fb66a8300052d7a00860e160bd310460"}
Apr 17 11:19:15.286985 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:15.286919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" event={"ID":"6fc99e5e-620d-41ad-80d7-fc34f0995756","Type":"ContainerStarted","Data":"8b5e0aa0b11256850b8f9e0ede2f8aae9924c8fda6a4d1b1bbef295192904243"}
Apr 17 11:19:15.315446 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:15.315384 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cmgrm" podStartSLOduration=3.531513125 podStartE2EDuration="4.3153633s" podCreationTimestamp="2026-04-17 11:19:11 +0000 UTC" firstStartedPulling="2026-04-17 11:19:12.869816168 +0000 UTC m=+185.786341086" lastFinishedPulling="2026-04-17 11:19:13.653666343 +0000 UTC m=+186.570191261" observedRunningTime="2026-04-17 11:19:15.31490628 +0000 UTC m=+188.231431206" watchObservedRunningTime="2026-04-17 11:19:15.3153633 +0000 UTC m=+188.231888227"
Apr 17 11:19:15.334849 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:15.334788 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k8fn7" podStartSLOduration=2.668781738 podStartE2EDuration="4.334766362s" podCreationTimestamp="2026-04-17 11:19:11 +0000 UTC" firstStartedPulling="2026-04-17 11:19:13.124828641 +0000 UTC m=+186.041353545" lastFinishedPulling="2026-04-17 11:19:14.790813255 +0000 UTC m=+187.707338169" observedRunningTime="2026-04-17 11:19:15.33344681 +0000 UTC m=+188.249971760" watchObservedRunningTime="2026-04-17 11:19:15.334766362 +0000 UTC m=+188.251291288"
Apr 17 11:19:16.292072 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.291954 2571 generic.go:358] "Generic (PLEG): container finished" podID="221d8e81-bcce-4bbe-9c4f-974f2cce276f" containerID="1d7cba067d8ca16650a7e640dfbaab3139306553e4a6c8841f978b346efc266c" exitCode=0
Apr 17 11:19:16.292603 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.292061 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerDied","Data":"1d7cba067d8ca16650a7e640dfbaab3139306553e4a6c8841f978b346efc266c"}
Apr 17 11:19:16.428647 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.428612 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-574958897f-22qmn"]
Apr 17 11:19:16.446102 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.446067 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-574958897f-22qmn"]
Apr 17 11:19:16.446297 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.446181 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.449186 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.449148 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-9p5hk\""
Apr 17 11:19:16.449186 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.449148 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 11:19:16.449375 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.449157 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 11:19:16.449687 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.449669 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 11:19:16.449835 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.449809 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7cr681sg34r59\""
Apr 17 11:19:16.449940 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.449778 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 11:19:16.547146 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8246382-0813-4efe-8aaf-b27bec2d68ca-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.547146 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhs97\" (UniqueName: \"kubernetes.io/projected/d8246382-0813-4efe-8aaf-b27bec2d68ca-kube-api-access-rhs97\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.547146 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d8246382-0813-4efe-8aaf-b27bec2d68ca-audit-log\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.547433 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d8246382-0813-4efe-8aaf-b27bec2d68ca-metrics-server-audit-profiles\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.547433 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547222 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-secret-metrics-server-tls\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.547433 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-secret-metrics-server-client-certs\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.547433 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.547301 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-client-ca-bundle\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648104 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8246382-0813-4efe-8aaf-b27bec2d68ca-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648307 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhs97\" (UniqueName: \"kubernetes.io/projected/d8246382-0813-4efe-8aaf-b27bec2d68ca-kube-api-access-rhs97\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648307 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d8246382-0813-4efe-8aaf-b27bec2d68ca-audit-log\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") "
pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648307 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d8246382-0813-4efe-8aaf-b27bec2d68ca-metrics-server-audit-profiles\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648474 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648312 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-secret-metrics-server-tls\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648474 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648365 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-secret-metrics-server-client-certs\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648474 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-client-ca-bundle\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648617 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d8246382-0813-4efe-8aaf-b27bec2d68ca-audit-log\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.648925 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.648894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8246382-0813-4efe-8aaf-b27bec2d68ca-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.649329 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.649282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d8246382-0813-4efe-8aaf-b27bec2d68ca-metrics-server-audit-profiles\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.651415 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.651392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-secret-metrics-server-tls\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.651514 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.651391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-client-ca-bundle\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.651573 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.651507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d8246382-0813-4efe-8aaf-b27bec2d68ca-secret-metrics-server-client-certs\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.661382 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.661352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhs97\" (UniqueName: \"kubernetes.io/projected/d8246382-0813-4efe-8aaf-b27bec2d68ca-kube-api-access-rhs97\") pod \"metrics-server-574958897f-22qmn\" (UID: \"d8246382-0813-4efe-8aaf-b27bec2d68ca\") " pod="openshift-monitoring/metrics-server-574958897f-22qmn"
Apr 17 11:19:16.757472 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.757432 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/metrics-server-574958897f-22qmn" Apr 17 11:19:16.907245 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:16.907067 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-574958897f-22qmn"] Apr 17 11:19:16.910720 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:19:16.910668 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8246382_0813_4efe_8aaf_b27bec2d68ca.slice/crio-747d5ca0f4a5188c762641170160da598e090aed00b624e32cae804711388563 WatchSource:0}: Error finding container 747d5ca0f4a5188c762641170160da598e090aed00b624e32cae804711388563: Status 404 returned error can't find the container with id 747d5ca0f4a5188c762641170160da598e090aed00b624e32cae804711388563 Apr 17 11:19:17.296711 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:17.296667 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-574958897f-22qmn" event={"ID":"d8246382-0813-4efe-8aaf-b27bec2d68ca","Type":"ContainerStarted","Data":"747d5ca0f4a5188c762641170160da598e090aed00b624e32cae804711388563"} Apr 17 11:19:19.304895 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:19.304860 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"9df01b61bacce7a017ee18f2c055d37f8c87cdb1be2cf0534899188abb037442"} Apr 17 11:19:20.310591 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:20.310508 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-574958897f-22qmn" event={"ID":"d8246382-0813-4efe-8aaf-b27bec2d68ca","Type":"ContainerStarted","Data":"9a01a7f30a4054f123482bcc28beaaf9de7ee99762c29715edc8b8fbd612cbb5"} Apr 17 11:19:20.314426 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:20.314351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"768077a744ff6912fd8bdb08ae6bd78b1fd4b50cc2e8070d474a59237836a10f"} Apr 17 11:19:20.314426 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:20.314388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"bc6a7944b506ca6c5b2eede41d140d0922088ab7e58e701e48590a41d76978e9"} Apr 17 11:19:20.314426 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:20.314404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"6d0f5b77229a57e8f6f649fb189273ce5484358e4c445974dfda25b42ce09916"} Apr 17 11:19:20.314426 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:20.314426 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"2b2621827147b1da3709706e7853b0bc0df0cc60b010985d0b5258e2b1446a52"} Apr 17 11:19:20.332859 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:20.332806 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-574958897f-22qmn" podStartSLOduration=2.026463806 podStartE2EDuration="4.332789586s" podCreationTimestamp="2026-04-17 11:19:16 +0000 UTC" firstStartedPulling="2026-04-17 11:19:16.913196147 +0000 UTC m=+189.829721052" lastFinishedPulling="2026-04-17 11:19:19.219521916 +0000 UTC m=+192.136046832" observedRunningTime="2026-04-17 11:19:20.331179332 +0000 UTC m=+193.247704260" watchObservedRunningTime="2026-04-17 11:19:20.332789586 +0000 UTC m=+193.249314511" Apr 17 11:19:21.319564 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:21.319480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"221d8e81-bcce-4bbe-9c4f-974f2cce276f","Type":"ContainerStarted","Data":"073eba5fb68831e6b50f1d21055fa0ab2a479e4977fb05758c5170b10837be95"} Apr 17 11:19:21.359175 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:21.359101 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.4621344569999999 podStartE2EDuration="8.359087368s" podCreationTimestamp="2026-04-17 11:19:13 +0000 UTC" firstStartedPulling="2026-04-17 11:19:14.125774682 +0000 UTC m=+187.042299602" lastFinishedPulling="2026-04-17 11:19:21.022727609 +0000 UTC m=+193.939252513" observedRunningTime="2026-04-17 11:19:21.352773851 +0000 UTC m=+194.269298811" watchObservedRunningTime="2026-04-17 11:19:21.359087368 +0000 UTC m=+194.275612300" Apr 17 11:19:23.104136 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:23.104054 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" podUID="151d1dc0-4349-4e64-bd4a-f976c47520fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:19:33.103771 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:33.103733 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" podUID="151d1dc0-4349-4e64-bd4a-f976c47520fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:19:36.757822 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:36.757778 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-574958897f-22qmn" Apr 17 11:19:36.757822 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:36.757830 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/metrics-server-574958897f-22qmn" Apr 17 11:19:43.104561 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.104522 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" podUID="151d1dc0-4349-4e64-bd4a-f976c47520fb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:19:43.104987 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.104594 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" Apr 17 11:19:43.105066 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.105036 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"1e60854e31713e597321e9ccc7a5ca749c75ce388480b08ca5c6499e6850aaf2"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 11:19:43.105110 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.105096 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" podUID="151d1dc0-4349-4e64-bd4a-f976c47520fb" containerName="service-proxy" containerID="cri-o://1e60854e31713e597321e9ccc7a5ca749c75ce388480b08ca5c6499e6850aaf2" gracePeriod=30 Apr 17 11:19:43.377744 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.377664 2571 generic.go:358] "Generic (PLEG): container finished" podID="151d1dc0-4349-4e64-bd4a-f976c47520fb" containerID="1e60854e31713e597321e9ccc7a5ca749c75ce388480b08ca5c6499e6850aaf2" exitCode=2 Apr 17 11:19:43.377744 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.377685 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" event={"ID":"151d1dc0-4349-4e64-bd4a-f976c47520fb","Type":"ContainerDied","Data":"1e60854e31713e597321e9ccc7a5ca749c75ce388480b08ca5c6499e6850aaf2"} Apr 17 11:19:43.377744 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:43.377721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7459b48476-5jzl4" event={"ID":"151d1dc0-4349-4e64-bd4a-f976c47520fb","Type":"ContainerStarted","Data":"fb3333f7b71441be088b16e7badc2eb167778ff9a735c0c4a4f3bc63fbb2d296"} Apr 17 11:19:56.763102 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:56.763074 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-574958897f-22qmn" Apr 17 11:19:56.767068 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:19:56.767045 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-574958897f-22qmn" Apr 17 11:20:19.449790 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:19.449752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:20:19.451939 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:19.451903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/39a4801b-dc46-42c1-a6fe-8a2d79362e6b-metrics-certs\") pod \"network-metrics-daemon-6zdmq\" (UID: \"39a4801b-dc46-42c1-a6fe-8a2d79362e6b\") " pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:20:19.685475 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:19.685444 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q4j6t\"" Apr 17 11:20:19.692977 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:19.692960 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6zdmq" Apr 17 11:20:19.812986 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:19.812953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6zdmq"] Apr 17 11:20:19.821213 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:20:19.821187 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39a4801b_dc46_42c1_a6fe_8a2d79362e6b.slice/crio-970833d439262a3a1fee2da5c3902b05a46316ea0ea04110e423ecfc6544430a WatchSource:0}: Error finding container 970833d439262a3a1fee2da5c3902b05a46316ea0ea04110e423ecfc6544430a: Status 404 returned error can't find the container with id 970833d439262a3a1fee2da5c3902b05a46316ea0ea04110e423ecfc6544430a Apr 17 11:20:20.476014 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:20.475970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6zdmq" event={"ID":"39a4801b-dc46-42c1-a6fe-8a2d79362e6b","Type":"ContainerStarted","Data":"970833d439262a3a1fee2da5c3902b05a46316ea0ea04110e423ecfc6544430a"} Apr 17 11:20:21.480629 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:21.480584 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6zdmq" event={"ID":"39a4801b-dc46-42c1-a6fe-8a2d79362e6b","Type":"ContainerStarted","Data":"ad7c147677faa6e3e0d4d93c8ef110df721fd59cf88f19794284eb8d9877c06e"} Apr 17 11:20:21.480629 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:21.480631 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6zdmq" 
event={"ID":"39a4801b-dc46-42c1-a6fe-8a2d79362e6b","Type":"ContainerStarted","Data":"34fd747aaf10d1b962a3cbbebebe56ae0983c388b73f02bca04c7e6c06142cdd"} Apr 17 11:20:21.503469 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:21.503409 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6zdmq" podStartSLOduration=253.595537165 podStartE2EDuration="4m14.50339014s" podCreationTimestamp="2026-04-17 11:16:07 +0000 UTC" firstStartedPulling="2026-04-17 11:20:19.822912209 +0000 UTC m=+252.739437120" lastFinishedPulling="2026-04-17 11:20:20.73076519 +0000 UTC m=+253.647290095" observedRunningTime="2026-04-17 11:20:21.501975661 +0000 UTC m=+254.418500587" watchObservedRunningTime="2026-04-17 11:20:21.50339014 +0000 UTC m=+254.419915066" Apr 17 11:20:36.407864 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.407785 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6fd4575d7d-vrd74"] Apr 17 11:20:36.410964 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.410948 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.414408 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.414379 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 11:20:36.414730 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.414707 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 11:20:36.414891 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.414876 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 11:20:36.415180 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.415164 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 11:20:36.415526 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.415513 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-v7lth\"" Apr 17 11:20:36.417818 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.417799 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 11:20:36.420954 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.420936 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 11:20:36.428007 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.427986 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6fd4575d7d-vrd74"] Apr 17 11:20:36.588781 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.588721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-serving-certs-ca-bundle\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.588781 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.588785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.588998 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.588855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mfmp\" (UniqueName: \"kubernetes.io/projected/1cf43c23-f3c9-4878-be44-80fb971d7c34-kube-api-access-9mfmp\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.588998 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.588938 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-federate-client-tls\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.588998 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.588980 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-secret-telemeter-client\") pod 
\"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.589182 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.589014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-metrics-client-ca\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.589182 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.589037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-telemeter-client-tls\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.589182 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.589087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690423 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-metrics-client-ca\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" 
Apr 17 11:20:36.690423 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-telemeter-client-tls\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690423 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690411 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690676 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690454 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-serving-certs-ca-bundle\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690676 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690676 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9mfmp\" (UniqueName: \"kubernetes.io/projected/1cf43c23-f3c9-4878-be44-80fb971d7c34-kube-api-access-9mfmp\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690824 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-federate-client-tls\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.690824 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.690765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-secret-telemeter-client\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.691200 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.691169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-serving-certs-ca-bundle\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.691341 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.691169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-metrics-client-ca\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " 
pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.691518 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.691490 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1cf43c23-f3c9-4878-be44-80fb971d7c34-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.692954 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.692928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.693182 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.693163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-secret-telemeter-client\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.693235 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.693168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-telemeter-client-tls\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.693318 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.693302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1cf43c23-f3c9-4878-be44-80fb971d7c34-federate-client-tls\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.699294 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.699268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mfmp\" (UniqueName: \"kubernetes.io/projected/1cf43c23-f3c9-4878-be44-80fb971d7c34-kube-api-access-9mfmp\") pod \"telemeter-client-6fd4575d7d-vrd74\" (UID: \"1cf43c23-f3c9-4878-be44-80fb971d7c34\") " pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.720264 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.720238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" Apr 17 11:20:36.847869 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:36.847823 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6fd4575d7d-vrd74"] Apr 17 11:20:36.851917 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:20:36.851888 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf43c23_f3c9_4878_be44_80fb971d7c34.slice/crio-899ff6184d9eb3375470306e7ac5b76ed32f5eb7f4fa5648ed2f4db337c0103c WatchSource:0}: Error finding container 899ff6184d9eb3375470306e7ac5b76ed32f5eb7f4fa5648ed2f4db337c0103c: Status 404 returned error can't find the container with id 899ff6184d9eb3375470306e7ac5b76ed32f5eb7f4fa5648ed2f4db337c0103c Apr 17 11:20:37.526477 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:37.526440 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" 
event={"ID":"1cf43c23-f3c9-4878-be44-80fb971d7c34","Type":"ContainerStarted","Data":"899ff6184d9eb3375470306e7ac5b76ed32f5eb7f4fa5648ed2f4db337c0103c"} Apr 17 11:20:39.533756 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:39.533717 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" event={"ID":"1cf43c23-f3c9-4878-be44-80fb971d7c34","Type":"ContainerStarted","Data":"ee918af77fb3d4afef86af2a491ee9076182e27ca31752abaaede46217022858"} Apr 17 11:20:39.533756 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:39.533753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" event={"ID":"1cf43c23-f3c9-4878-be44-80fb971d7c34","Type":"ContainerStarted","Data":"4b279eea0db84ca68c20ec0f84e37682f733e6a7eb193278cea81fc9a157c20a"} Apr 17 11:20:39.533756 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:39.533764 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" event={"ID":"1cf43c23-f3c9-4878-be44-80fb971d7c34","Type":"ContainerStarted","Data":"db4e85ae705a2fe5b40d05e19f548d78d67ddc3cf5828a6e2893eb41c47cc218"} Apr 17 11:20:39.558724 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:20:39.558668 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6fd4575d7d-vrd74" podStartSLOduration=1.433139325 podStartE2EDuration="3.558653551s" podCreationTimestamp="2026-04-17 11:20:36 +0000 UTC" firstStartedPulling="2026-04-17 11:20:36.853847931 +0000 UTC m=+269.770372840" lastFinishedPulling="2026-04-17 11:20:38.979362146 +0000 UTC m=+271.895887066" observedRunningTime="2026-04-17 11:20:39.556748121 +0000 UTC m=+272.473273046" watchObservedRunningTime="2026-04-17 11:20:39.558653551 +0000 UTC m=+272.475178476" Apr 17 11:21:07.565525 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:21:07.565483 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:21:07.567866 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:21:07.567838 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:21:07.572096 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:21:07.572077 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 11:22:52.249369 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.249335 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jww9h"]
Apr 17 11:22:52.252381 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.252364 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.256520 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.256497 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 11:22:52.256626 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.256533 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 17 11:22:52.256626 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.256545 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 11:22:52.256712 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.256657 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-flcr6\""
Apr 17 11:22:52.256712 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.256666 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 17 11:22:52.257764 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.257750 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 11:22:52.267675 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.267655 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jww9h"]
Apr 17 11:22:52.317343 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.317320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.317440 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.317383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-cabundle0\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.317440 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.317403 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgl8h\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-kube-api-access-pgl8h\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.418094 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.418066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-cabundle0\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.418094 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.418097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgl8h\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-kube-api-access-pgl8h\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.418260 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.418146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.418295 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.418260 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 17 11:22:52.418295 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.418270 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 11:22:52.418295 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.418278 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jww9h: references non-existent secret key: ca.crt
Apr 17 11:22:52.418469 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.418439 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates podName:b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:52.918317586 +0000 UTC m=+405.834842491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates") pod "keda-operator-ffbb595cb-jww9h" (UID: "b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6") : references non-existent secret key: ca.crt
Apr 17 11:22:52.418713 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.418694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-cabundle0\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.427141 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.427106 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgl8h\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-kube-api-access-pgl8h\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.922446 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:52.922414 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:52.922596 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.922558 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 17 11:22:52.922596 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.922571 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 11:22:52.922596 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.922579 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jww9h: references non-existent secret key: ca.crt
Apr 17 11:22:52.922697 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:52.922638 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates podName:b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:53.922624815 +0000 UTC m=+406.839149719 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates") pod "keda-operator-ffbb595cb-jww9h" (UID: "b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6") : references non-existent secret key: ca.crt
Apr 17 11:22:53.930340 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:53.930305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:53.930702 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:53.930422 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 17 11:22:53.930702 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:53.930434 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 11:22:53.930702 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:53.930444 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jww9h: references non-existent secret key: ca.crt
Apr 17 11:22:53.930702 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:53.930502 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates podName:b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:55.930488463 +0000 UTC m=+408.847013368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates") pod "keda-operator-ffbb595cb-jww9h" (UID: "b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6") : references non-existent secret key: ca.crt
Apr 17 11:22:55.946152 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:55.946096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:55.946504 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:55.946251 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 17 11:22:55.946504 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:55.946272 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 17 11:22:55.946504 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:55.946281 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jww9h: references non-existent secret key: ca.crt
Apr 17 11:22:55.946504 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:22:55.946340 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates podName:b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:59.946326103 +0000 UTC m=+412.862851008 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates") pod "keda-operator-ffbb595cb-jww9h" (UID: "b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6") : references non-existent secret key: ca.crt
Apr 17 11:22:59.984256 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:59.984198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:22:59.986584 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:22:59.986561 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6-certificates\") pod \"keda-operator-ffbb595cb-jww9h\" (UID: \"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6\") " pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:23:00.061569 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:00.061540 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:23:00.178401 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:00.178369 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jww9h"]
Apr 17 11:23:00.180795 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:23:00.180769 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb95d57ef_c446_4eec_a5f5_73a1f1c9f9f6.slice/crio-5ef05625dfa12684f03397f4384e23b1cf4870bb840b92bf5a4ab222ad781706 WatchSource:0}: Error finding container 5ef05625dfa12684f03397f4384e23b1cf4870bb840b92bf5a4ab222ad781706: Status 404 returned error can't find the container with id 5ef05625dfa12684f03397f4384e23b1cf4870bb840b92bf5a4ab222ad781706
Apr 17 11:23:00.182269 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:00.182251 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:23:00.916523 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:00.916468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jww9h" event={"ID":"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6","Type":"ContainerStarted","Data":"5ef05625dfa12684f03397f4384e23b1cf4870bb840b92bf5a4ab222ad781706"}
Apr 17 11:23:03.926194 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:03.926158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jww9h" event={"ID":"b95d57ef-c446-4eec-a5f5-73a1f1c9f9f6","Type":"ContainerStarted","Data":"02c1e2ef3ccbf2aa73294a1064477831ebdc66e96020b832ead32cd47ee0399f"}
Apr 17 11:23:03.926581 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:03.926284 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:23:03.951277 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:03.951224 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-jww9h" podStartSLOduration=8.97215661 podStartE2EDuration="11.951209427s" podCreationTimestamp="2026-04-17 11:22:52 +0000 UTC" firstStartedPulling="2026-04-17 11:23:00.182377563 +0000 UTC m=+413.098902468" lastFinishedPulling="2026-04-17 11:23:03.16143038 +0000 UTC m=+416.077955285" observedRunningTime="2026-04-17 11:23:03.94949793 +0000 UTC m=+416.866022867" watchObservedRunningTime="2026-04-17 11:23:03.951209427 +0000 UTC m=+416.867734353"
Apr 17 11:23:24.936521 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:23:24.936444 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-jww9h"
Apr 17 11:24:01.418759 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.418726 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"]
Apr 17 11:24:01.421762 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.421746 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:01.424480 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.424457 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 17 11:24:01.424585 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.424493 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 17 11:24:01.424729 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.424713 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 17 11:24:01.425735 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.425719 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-9dtgb\""
Apr 17 11:24:01.433658 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.433635 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"]
Apr 17 11:24:01.496494 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.496464 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:01.496664 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.496511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9glm\" (UniqueName: \"kubernetes.io/projected/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-kube-api-access-t9glm\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:01.597541 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.597498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9glm\" (UniqueName: \"kubernetes.io/projected/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-kube-api-access-t9glm\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:01.597717 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.597586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:01.597717 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:24:01.597702 2571 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 17 11:24:01.597806 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:24:01.597795 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-cert podName:1b649d64-2feb-412a-9be9-cdb6ae8ec34a nodeName:}" failed. No retries permitted until 2026-04-17 11:24:02.09777496 +0000 UTC m=+475.014299873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-cert") pod "llmisvc-controller-manager-68cc5db7c4-dj4bz" (UID: "1b649d64-2feb-412a-9be9-cdb6ae8ec34a") : secret "llmisvc-webhook-server-cert" not found
Apr 17 11:24:01.606672 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:01.606641 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9glm\" (UniqueName: \"kubernetes.io/projected/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-kube-api-access-t9glm\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:02.101976 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:02.101947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:02.104342 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:02.104320 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b649d64-2feb-412a-9be9-cdb6ae8ec34a-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dj4bz\" (UID: \"1b649d64-2feb-412a-9be9-cdb6ae8ec34a\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:02.331184 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:02.331154 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:02.445023 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:02.444986 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"]
Apr 17 11:24:02.447127 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:24:02.447088 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1b649d64_2feb_412a_9be9_cdb6ae8ec34a.slice/crio-279916a3051f57a397e266a713f8478f894da973609d942c3775351118638ad7 WatchSource:0}: Error finding container 279916a3051f57a397e266a713f8478f894da973609d942c3775351118638ad7: Status 404 returned error can't find the container with id 279916a3051f57a397e266a713f8478f894da973609d942c3775351118638ad7
Apr 17 11:24:03.098206 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:03.098168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz" event={"ID":"1b649d64-2feb-412a-9be9-cdb6ae8ec34a","Type":"ContainerStarted","Data":"279916a3051f57a397e266a713f8478f894da973609d942c3775351118638ad7"}
Apr 17 11:24:05.105796 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:05.105760 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz" event={"ID":"1b649d64-2feb-412a-9be9-cdb6ae8ec34a","Type":"ContainerStarted","Data":"dba1990417be342db26d88e18104d9ab5715fcbd8e3493e77af4e354c63382b8"}
Apr 17 11:24:05.106330 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:05.105878 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:24:05.124539 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:05.124484 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz" podStartSLOduration=2.293474361 podStartE2EDuration="4.124469554s" podCreationTimestamp="2026-04-17 11:24:01 +0000 UTC" firstStartedPulling="2026-04-17 11:24:02.44834034 +0000 UTC m=+475.364865244" lastFinishedPulling="2026-04-17 11:24:04.279335521 +0000 UTC m=+477.195860437" observedRunningTime="2026-04-17 11:24:05.123046117 +0000 UTC m=+478.039571044" watchObservedRunningTime="2026-04-17 11:24:05.124469554 +0000 UTC m=+478.040994479"
Apr 17 11:24:36.111716 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:24:36.111684 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dj4bz"
Apr 17 11:25:10.313984 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.313905 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-spqn5"]
Apr 17 11:25:10.315909 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.315892 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.318971 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.318948 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 17 11:25:10.318971 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.318963 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-jw57j\""
Apr 17 11:25:10.329271 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.329247 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-spqn5"]
Apr 17 11:25:10.348146 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.348088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwdxl\" (UniqueName: \"kubernetes.io/projected/f91c60cf-2c64-4a95-8086-ac4e069d0227-kube-api-access-mwdxl\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.348315 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.348198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f91c60cf-2c64-4a95-8086-ac4e069d0227-cert\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.448975 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.448935 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwdxl\" (UniqueName: \"kubernetes.io/projected/f91c60cf-2c64-4a95-8086-ac4e069d0227-kube-api-access-mwdxl\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.449144 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.449001 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f91c60cf-2c64-4a95-8086-ac4e069d0227-cert\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.449144 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:25:10.449108 2571 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 17 11:25:10.449231 ip-10-0-141-16 kubenswrapper[2571]: E0417 11:25:10.449189 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91c60cf-2c64-4a95-8086-ac4e069d0227-cert podName:f91c60cf-2c64-4a95-8086-ac4e069d0227 nodeName:}" failed. No retries permitted until 2026-04-17 11:25:10.949169929 +0000 UTC m=+543.865694839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f91c60cf-2c64-4a95-8086-ac4e069d0227-cert") pod "odh-model-controller-696fc77849-spqn5" (UID: "f91c60cf-2c64-4a95-8086-ac4e069d0227") : secret "odh-model-controller-webhook-cert" not found
Apr 17 11:25:10.459372 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.459347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwdxl\" (UniqueName: \"kubernetes.io/projected/f91c60cf-2c64-4a95-8086-ac4e069d0227-kube-api-access-mwdxl\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.953714 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.953667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f91c60cf-2c64-4a95-8086-ac4e069d0227-cert\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:10.956160 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:10.956109 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f91c60cf-2c64-4a95-8086-ac4e069d0227-cert\") pod \"odh-model-controller-696fc77849-spqn5\" (UID: \"f91c60cf-2c64-4a95-8086-ac4e069d0227\") " pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:11.225872 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:11.225769 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:11.343113 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:11.343067 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-spqn5"]
Apr 17 11:25:11.345946 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:25:11.345918 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91c60cf_2c64_4a95_8086_ac4e069d0227.slice/crio-7acf1fe169fb96e3cd22805504f42cbf09878986db322bf1b2c8013e01811586 WatchSource:0}: Error finding container 7acf1fe169fb96e3cd22805504f42cbf09878986db322bf1b2c8013e01811586: Status 404 returned error can't find the container with id 7acf1fe169fb96e3cd22805504f42cbf09878986db322bf1b2c8013e01811586
Apr 17 11:25:12.298930 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:12.298883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-spqn5" event={"ID":"f91c60cf-2c64-4a95-8086-ac4e069d0227","Type":"ContainerStarted","Data":"7acf1fe169fb96e3cd22805504f42cbf09878986db322bf1b2c8013e01811586"}
Apr 17 11:25:14.306581 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:14.306486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-spqn5" event={"ID":"f91c60cf-2c64-4a95-8086-ac4e069d0227","Type":"ContainerStarted","Data":"9624e7423af34e894f0645c9e1576b9f7b5bf651bd874535e7d3196a0821e2b5"}
Apr 17 11:25:14.306988 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:14.306590 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:25:14.334709 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:14.334624 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-spqn5" podStartSLOduration=1.8389253220000001 podStartE2EDuration="4.334605711s" podCreationTimestamp="2026-04-17 11:25:10 +0000 UTC" firstStartedPulling="2026-04-17 11:25:11.347176787 +0000 UTC m=+544.263701691" lastFinishedPulling="2026-04-17 11:25:13.842857174 +0000 UTC m=+546.759382080" observedRunningTime="2026-04-17 11:25:14.333494968 +0000 UTC m=+547.250019894" watchObservedRunningTime="2026-04-17 11:25:14.334605711 +0000 UTC m=+547.251130636"
Apr 17 11:25:25.311399 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:25:25.311364 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-spqn5"
Apr 17 11:26:06.804386 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.804350 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkw4c/must-gather-h8fbf"]
Apr 17 11:26:06.807658 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.807640 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:06.820489 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.820466 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xkw4c\"/\"default-dockercfg-vtcwd\""
Apr 17 11:26:06.820489 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.820482 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkw4c\"/\"kube-root-ca.crt\""
Apr 17 11:26:06.820791 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.820768 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkw4c\"/\"openshift-service-ca.crt\""
Apr 17 11:26:06.822319 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.822299 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/must-gather-h8fbf"]
Apr 17 11:26:06.913511 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.913477 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15c39245-19c0-4186-9cef-8ed04d27ad9c-must-gather-output\") pod \"must-gather-h8fbf\" (UID: \"15c39245-19c0-4186-9cef-8ed04d27ad9c\") " pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:06.913694 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:06.913535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsdvw\" (UniqueName: \"kubernetes.io/projected/15c39245-19c0-4186-9cef-8ed04d27ad9c-kube-api-access-dsdvw\") pod \"must-gather-h8fbf\" (UID: \"15c39245-19c0-4186-9cef-8ed04d27ad9c\") " pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:07.014262 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.014218 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsdvw\" (UniqueName: \"kubernetes.io/projected/15c39245-19c0-4186-9cef-8ed04d27ad9c-kube-api-access-dsdvw\") pod \"must-gather-h8fbf\" (UID: \"15c39245-19c0-4186-9cef-8ed04d27ad9c\") " pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:07.014408 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.014305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15c39245-19c0-4186-9cef-8ed04d27ad9c-must-gather-output\") pod \"must-gather-h8fbf\" (UID: \"15c39245-19c0-4186-9cef-8ed04d27ad9c\") " pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:07.014607 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.014592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/15c39245-19c0-4186-9cef-8ed04d27ad9c-must-gather-output\") pod \"must-gather-h8fbf\" (UID: \"15c39245-19c0-4186-9cef-8ed04d27ad9c\") " pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:07.023235 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.023205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsdvw\" (UniqueName: \"kubernetes.io/projected/15c39245-19c0-4186-9cef-8ed04d27ad9c-kube-api-access-dsdvw\") pod \"must-gather-h8fbf\" (UID: \"15c39245-19c0-4186-9cef-8ed04d27ad9c\") " pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:07.115701 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.115659 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/must-gather-h8fbf"
Apr 17 11:26:07.239135 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.239080 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/must-gather-h8fbf"]
Apr 17 11:26:07.455327 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.455238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/must-gather-h8fbf" event={"ID":"15c39245-19c0-4186-9cef-8ed04d27ad9c","Type":"ContainerStarted","Data":"7c21068a4eb9323b5a1c153ef0ffcfd0ff6d1c4d84a52da09a440e2a42ff0b1b"}
Apr 17 11:26:07.588746 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.588719 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:26:07.589465 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:07.589435 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log"
Apr 17 11:26:08.459982 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:08.459937 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/must-gather-h8fbf" event={"ID":"15c39245-19c0-4186-9cef-8ed04d27ad9c","Type":"ContainerStarted","Data":"91e4f8ed4c55cfa2b9aa9259b1a3c427770b6c2a0e91521df0c515c11f0f9449"}
Apr 17 11:26:08.459982 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:08.459983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/must-gather-h8fbf" event={"ID":"15c39245-19c0-4186-9cef-8ed04d27ad9c","Type":"ContainerStarted","Data":"6bd19bca5ede3ca72f0b66931ad149d612c6383a567aaf547e40e34e4b7255d0"}
Apr 17 11:26:08.476234 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:08.476186 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkw4c/must-gather-h8fbf" podStartSLOduration=1.712839325 podStartE2EDuration="2.476169707s" podCreationTimestamp="2026-04-17 11:26:06 +0000 UTC" firstStartedPulling="2026-04-17 11:26:07.243376806 +0000 UTC m=+600.159901714" lastFinishedPulling="2026-04-17 11:26:08.006707179 +0000 UTC m=+600.923232096" observedRunningTime="2026-04-17 11:26:08.474853625 +0000 UTC m=+601.391378552" watchObservedRunningTime="2026-04-17 11:26:08.476169707 +0000 UTC m=+601.392694633"
Apr 17 11:26:09.513524 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:09.513487 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-n25xj_75fcf97e-549f-48be-9a3a-142bc6a20eaa/global-pull-secret-syncer/0.log"
Apr 17 11:26:09.562563 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:09.562533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5pxkv_670fcc7b-8343-46ca-b1b5-00040742a8e8/konnectivity-agent/0.log"
Apr 17 11:26:09.717750 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:09.717716 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-16.ec2.internal_237efac7542ae805317afa8331e5e27b/haproxy/0.log"
Apr 17 11:26:13.013710 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.013675 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/alertmanager/0.log"
Apr 17 11:26:13.037098 ip-10-0-141-16 kubenswrapper[2571]: I0417
11:26:13.037061 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/config-reloader/0.log" Apr 17 11:26:13.062987 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.062955 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/kube-rbac-proxy-web/0.log" Apr 17 11:26:13.091247 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.091217 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/kube-rbac-proxy/0.log" Apr 17 11:26:13.115138 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.115092 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/kube-rbac-proxy-metric/0.log" Apr 17 11:26:13.139674 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.139640 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/prom-label-proxy/0.log" Apr 17 11:26:13.161690 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.161664 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_221d8e81-bcce-4bbe-9c4f-974f2cce276f/init-config-reloader/0.log" Apr 17 11:26:13.327873 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.327804 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-574958897f-22qmn_d8246382-0813-4efe-8aaf-b27bec2d68ca/metrics-server/0.log" Apr 17 11:26:13.467374 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.467334 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmgrm_338b62b9-f284-4a24-8fa8-ae9c7f82ce56/node-exporter/0.log" Apr 17 11:26:13.487057 ip-10-0-141-16 kubenswrapper[2571]: I0417 
11:26:13.487032 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmgrm_338b62b9-f284-4a24-8fa8-ae9c7f82ce56/kube-rbac-proxy/0.log" Apr 17 11:26:13.514075 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.514050 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmgrm_338b62b9-f284-4a24-8fa8-ae9c7f82ce56/init-textfile/0.log" Apr 17 11:26:13.626106 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.626076 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k8fn7_6fc99e5e-620d-41ad-80d7-fc34f0995756/kube-rbac-proxy-main/0.log" Apr 17 11:26:13.648978 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.648937 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k8fn7_6fc99e5e-620d-41ad-80d7-fc34f0995756/kube-rbac-proxy-self/0.log" Apr 17 11:26:13.673241 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.673210 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k8fn7_6fc99e5e-620d-41ad-80d7-fc34f0995756/openshift-state-metrics/0.log" Apr 17 11:26:13.982079 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:13.981974 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6fd4575d7d-vrd74_1cf43c23-f3c9-4878-be44-80fb971d7c34/telemeter-client/0.log" Apr 17 11:26:14.003812 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:14.003780 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6fd4575d7d-vrd74_1cf43c23-f3c9-4878-be44-80fb971d7c34/reload/0.log" Apr 17 11:26:14.025828 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:14.025782 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-6fd4575d7d-vrd74_1cf43c23-f3c9-4878-be44-80fb971d7c34/kube-rbac-proxy/0.log" Apr 17 11:26:16.145599 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.145552 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-gbwjs_c68c2a92-99c5-42b3-be90-a59b191fd1cd/download-server/0.log" Apr 17 11:26:16.729039 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.728998 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm"] Apr 17 11:26:16.733802 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.733780 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.780490 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.780457 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm"] Apr 17 11:26:16.804164 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.804103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-lib-modules\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.804329 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.804205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-sys\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.804329 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.804270 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-podres\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.804502 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.804475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-proc\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.804644 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.804561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfc2j\" (UniqueName: \"kubernetes.io/projected/869b3ea3-ab64-47b9-8041-a3e51b866947-kube-api-access-bfc2j\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.905984 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.905934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-proc\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906173 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfc2j\" (UniqueName: \"kubernetes.io/projected/869b3ea3-ab64-47b9-8041-a3e51b866947-kube-api-access-bfc2j\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: 
\"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906173 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-lib-modules\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906173 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-sys\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906336 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-podres\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906336 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-proc\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906658 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906524 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-lib-modules\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906658 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-podres\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.906658 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.906651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/869b3ea3-ab64-47b9-8041-a3e51b866947-sys\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:16.915921 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:16.915893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfc2j\" (UniqueName: \"kubernetes.io/projected/869b3ea3-ab64-47b9-8041-a3e51b866947-kube-api-access-bfc2j\") pod \"perf-node-gather-daemonset-hz4cm\" (UID: \"869b3ea3-ab64-47b9-8041-a3e51b866947\") " pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:17.047375 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.047292 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:17.189995 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.189848 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm"] Apr 17 11:26:17.193759 ip-10-0-141-16 kubenswrapper[2571]: W0417 11:26:17.193719 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod869b3ea3_ab64_47b9_8041_a3e51b866947.slice/crio-80fecadc87e189d82dea65e4d06943d42b4b43a524b46b8c8bf9fe5da0821b66 WatchSource:0}: Error finding container 80fecadc87e189d82dea65e4d06943d42b4b43a524b46b8c8bf9fe5da0821b66: Status 404 returned error can't find the container with id 80fecadc87e189d82dea65e4d06943d42b4b43a524b46b8c8bf9fe5da0821b66 Apr 17 11:26:17.313566 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.313490 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7zgfs_87464ef4-c119-49ba-bee1-c792066f9cd0/dns/0.log" Apr 17 11:26:17.334795 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.334768 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7zgfs_87464ef4-c119-49ba-bee1-c792066f9cd0/kube-rbac-proxy/0.log" Apr 17 11:26:17.448397 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.448372 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k8djp_88f53a5c-9a8b-457b-9e6e-e62bf112bbb8/dns-node-resolver/0.log" Apr 17 11:26:17.499501 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.499469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" event={"ID":"869b3ea3-ab64-47b9-8041-a3e51b866947","Type":"ContainerStarted","Data":"448593c53d9db493a4022eb16f113a0ac04a54b4ddae6f9a25adc404ae93a0fd"} Apr 17 11:26:17.499501 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.499506 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" event={"ID":"869b3ea3-ab64-47b9-8041-a3e51b866947","Type":"ContainerStarted","Data":"80fecadc87e189d82dea65e4d06943d42b4b43a524b46b8c8bf9fe5da0821b66"} Apr 17 11:26:17.499723 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.499539 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:17.517416 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.517355 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" podStartSLOduration=1.5173341900000001 podStartE2EDuration="1.51733419s" podCreationTimestamp="2026-04-17 11:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:26:17.516279791 +0000 UTC m=+610.432804718" watchObservedRunningTime="2026-04-17 11:26:17.51733419 +0000 UTC m=+610.433859117" Apr 17 11:26:17.954653 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:17.954619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lqhrs_06811185-7a8c-419b-9f44-d67b67d794d3/node-ca/0.log" Apr 17 11:26:19.106441 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:19.106409 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kz66m_f0b90299-ec82-4706-837f-42097067ec57/serve-healthcheck-canary/0.log" Apr 17 11:26:19.483275 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:19.483202 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fk6ct_a6af175a-d4db-493f-a569-e21c7304b8de/kube-rbac-proxy/0.log" Apr 17 11:26:19.504142 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:19.504085 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-fk6ct_a6af175a-d4db-493f-a569-e21c7304b8de/exporter/0.log" Apr 17 11:26:19.524915 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:19.524887 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fk6ct_a6af175a-d4db-493f-a569-e21c7304b8de/extractor/0.log" Apr 17 11:26:21.645528 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:21.645459 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-dj4bz_1b649d64-2feb-412a-9be9-cdb6ae8ec34a/manager/0.log" Apr 17 11:26:21.699730 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:21.699683 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-spqn5_f91c60cf-2c64-4a95-8086-ac4e069d0227/manager/0.log" Apr 17 11:26:23.515631 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:23.515556 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xkw4c/perf-node-gather-daemonset-hz4cm" Apr 17 11:26:27.199210 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.199179 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/kube-multus-additional-cni-plugins/0.log" Apr 17 11:26:27.221038 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.221008 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/egress-router-binary-copy/0.log" Apr 17 11:26:27.240948 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.240923 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/cni-plugins/0.log" Apr 17 11:26:27.261423 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.261391 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/bond-cni-plugin/0.log" Apr 17 11:26:27.282362 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.282337 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/routeoverride-cni/0.log" Apr 17 11:26:27.302306 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.302271 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/whereabouts-cni-bincopy/0.log" Apr 17 11:26:27.323463 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.323436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d76ls_a61e6396-9d01-4767-84e7-6240ed2764cc/whereabouts-cni/0.log" Apr 17 11:26:27.583551 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.583527 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zc47d_7a6e582c-8fc4-4d48-a9f1-63fa4e09787a/kube-multus/0.log" Apr 17 11:26:27.605364 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.605321 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6zdmq_39a4801b-dc46-42c1-a6fe-8a2d79362e6b/network-metrics-daemon/0.log" Apr 17 11:26:27.625713 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:27.625671 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6zdmq_39a4801b-dc46-42c1-a6fe-8a2d79362e6b/kube-rbac-proxy/0.log" Apr 17 11:26:28.995039 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:28.995008 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-controller/0.log" Apr 17 11:26:29.012033 ip-10-0-141-16 kubenswrapper[2571]: I0417 
11:26:29.011999 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/0.log" Apr 17 11:26:29.017241 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.017219 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovn-acl-logging/1.log" Apr 17 11:26:29.041143 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.041099 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/kube-rbac-proxy-node/0.log" Apr 17 11:26:29.063579 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.063545 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:26:29.080030 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.080000 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/northd/0.log" Apr 17 11:26:29.102749 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.102719 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/nbdb/0.log" Apr 17 11:26:29.123064 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.123036 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/sbdb/0.log" Apr 17 11:26:29.304691 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:29.304606 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmrcg_1fa09637-267c-4a4b-8aac-54287c81cc4e/ovnkube-controller/0.log" Apr 17 11:26:30.335644 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:30.335611 
2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-w7npk_c41a7def-7809-48c2-80fa-7078299705ca/network-check-target-container/0.log" Apr 17 11:26:31.278869 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:31.278844 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-npwhk_e8a849a2-893a-4e45-b82e-22ee8ac74d6e/iptables-alerter/0.log" Apr 17 11:26:31.967133 ip-10-0-141-16 kubenswrapper[2571]: I0417 11:26:31.967098 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gh4f5_f1cd366b-5311-44d3-af2a-8b067cf4f65a/tuned/0.log"